Patrick Monaghan

Reputation: 53

How to pass many (over 100) variables into PHP

I know about the conventional $_POST method of getting variables into PHP. However, is that still the most effective way of passing variables when the number of variables is this high?

Currently, the user enters information into over 100 text inputs and this information is then passed to PHP via $_POST. However, I don't think this will fare too well on many servers.

Essentially it would be great if anyone could tell me the ideal way of passing large amounts of variables to the server. Thanks in advance.

EDIT:

For all the people saying it would simply be a UI disaster: that is something that has been considered many times. However, the submit page is essentially to be the heart of the website, and there are reasons why the user would not be dissuaded by the high number of inputs. In fact, the default number is 20, but the list can be expanded to 100 inputs, which is an option implemented FOR the user.

Thank you for all constructive replies, they are very helpful.

Upvotes: 3

Views: 2536

Answers (6)

astimen

Reputation: 1

You can bundle the variables into an XML document and post it through an AJAX request, along the lines of the MVC concept. First, build an XML document containing all the variables. Then post that XML as text: the view sends it in the AJAX request, and the PHP model receives the XML text and decomposes it with XML queries to recover the same set of variables.
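A minimal sketch of the receiving side of that idea, assuming the AJAX request POSTs the XML document as the raw request body and uses hypothetical <field name="..."> elements:

<?php
// Read the raw POST body (the XML text sent by the AJAX request).
$xmlText = file_get_contents('php://input');

// Parse it with SimpleXML; returns false on malformed XML.
$doc = simplexml_load_string($xmlText);
if ($doc === false) {
    http_response_code(400);
    exit('Invalid XML');
}

// Recover the individual variables, e.g. <field name="foo">bar</field>.
$variables = array();
foreach ($doc->field as $field) {
    $variables[(string) $field['name']] = (string) $field;
}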

Upvotes: 0

Bailey Parker

Reputation: 15903

This kind of load is certainly manageable depending on how much data you expect each field to contain, or rather, the maximum amount of data you've determined each field can contain. In PHP, the maximum size of the body of an HTTP POST request (the part that contains the form-encoded values) is determined by the ini value post_max_size. It has a default of 8MB, but you can change this in your php.ini:

post_max_size = 10M ; megabytes

Or in your .htaccess:

php_value post_max_size 10M

Take care when setting this because it should be no more than the amount of RAM available on your system. Also consider that you could have multiple users requesting this page, and if each of them gets an exorbitant amount of RAM allocated for their request, they could hang or crash your server.
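As a side note, if a request body does exceed post_max_size, PHP silently discards it: $_POST arrives empty even though the client sent data. A minimal sketch of a check for that case:

<?php
// If the POST body was larger than post_max_size, PHP drops it:
// $_POST is empty even though a body was sent.
$maxPost = ini_get('post_max_size'); // e.g. "10M"
if ($_SERVER['REQUEST_METHOD'] === 'POST'
    && empty($_POST)
    && isset($_SERVER['CONTENT_LENGTH'])
    && (int) $_SERVER['CONTENT_LENGTH'] > 0) {
    exit("Upload too large: the form data exceeded post_max_size ($maxPost).");
}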

Whatever limit you settle on, consider the math here. Even if you had 100 fields each with 20 bytes in them, that would only be 2000 bytes, which is about 2 KB. Even with a 1 Mbps upload speed, which is pretty slow, your users will be able to upload 128 KB per second. At that speed, each of the 100 fields would have to contain 1311 bytes of data for the upload to take a full second.

On Apache, the default timeout is 300 seconds, so your form fields would have to contain a combined total of 37.5 MB before Apache timed out. This setting might be altered by your host (or your server admin) and is probably set to a more reasonable value such as 30 seconds. But even at that limit, you would need 3.75 MB of data, which is likely far more than 100 fields can contain.

You should also not be concerned about the client side, because even the stingiest browser (IE) allows POST uploads of up to 2 GB.

Basically, my point here is that even with a slow connection, HTTP and your server are well capable of handling this many fields. I'm not sure how long it would take PHP to parse all of them (you'd have to benchmark that on your server), but I imagine the impact will be negligible.
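If you want a rough number for your own server, a quick benchmark is easy to throw together; here's a sketch (the field count and field size mirror the worst case estimated above):

<?php
// Build a synthetic form body: 100 fields of 1311 bytes each.
$pairs = array();
for ($i = 0; $i < 100; $i++) {
    $pairs[] = 'field' . $i . '=' . str_repeat('x', 1311);
}
$body = implode('&', $pairs);

// Time how long PHP takes to parse it, averaged over many runs.
$start = microtime(true);
for ($run = 0; $run < 1000; $run++) {
    parse_str($body, $result);
}
$elapsed = (microtime(true) - $start) / 1000;
printf("Average parse time: %.6f seconds\n", $elapsed);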

From a user's standpoint, 100 fields is a pretty daunting sight. If at all possible, it might be nicer to separate your form into friendlier, smaller steps that walk the user through the process of filling it out. If you would rather not split the form into steps, at least look into saving the state of the form with JavaScript. Note that the W3C recommends 5 MB of storage space for localStorage, so this should be plenty of space to store all of your fields.

Also look at this fallback that uses cookies, but be wary: cookies have tighter limits than localStorage. I've read that cookies are limited to 4 KB each and 20 cookies per domain, so you might want to distribute your stored form fields across several cookies, say 10 form fields in each of 10 cookies. You can store multiple input values in a cookie using encodeURIComponent():

var inputs = document.forms[0].getElementsByTagName('input'),
    i      = 0,
    date   = new Date(),
    expires;

// Expiry date (1 day in the future)
date.setTime(date.getTime() + (24 * 60 * 60 * 1000));
expires = date.toUTCString();

// Spread the fields across 10 cookies, 10 fields apiece
for (var cookieNumber = 0; cookieNumber < 10; cookieNumber++) {
    var cookie = [];
    for (; i < Math.min(inputs.length, cookieNumber * 10 + 10); i++) {
        // Encode names and values so '=' and '&' survive the round trip
        cookie.push(encodeURIComponent(inputs[i].name) + '=' + encodeURIComponent(inputs[i].value));
    }

    document.cookie = 'savedForm' + cookieNumber + '=' + cookie.join('&') + '; expires=' + expires;
}

To ensure that everything is saved as the user types it, you might want to update your stored data onchange or, if you want up-to-the-second saves, onkeyup.

Also, as another convenience to the user, when the form is submitted, all saved cookie and localStorage form-field data should be cleared, so that when they visit the form again all of the fields will be empty and ready for new input.

Upvotes: 2

ghoti

Reputation: 46876

You no doubt have the UX issues under control, or you'd be asking this question in User Experience.

From a performance standpoint, let's consider a few strategies for passing this data.

1. HTTP POST (variables passed as application/x-www-form-urlencoded)

The total size of your submission is easy to predict, but depends on the content of your form. If you're using just selects and checkboxes, or hidden variables with integers or booleans, 100 elements might turn into, say, 2KB of data. How long does it take to pass 2KB of data? Not long. Obviously, text fields change things (also in ways that you, but not we, can predict), but probably not significantly. The encoded string var=Pack+my+box+with+five+dozen+liquor+jugs is 43 characters; 100 of them would still generate less than 5KB of data to upload (see the sketch after these strategies for a way to measure this against your own data). Even at analog modem speeds (say, 28.8Kbps or 2.8KB/s), that's only a few seconds. And file attachments move us on to the next strategy.

2. HTTP POST (variables passed as multipart/form-data)

You'll use this if you're uploading files, and probably not for much else. If you plan to use your form to upload 100 4MB JPEG images, then I suggest you streamline things by implementing a JavaScript-based background uploader: let the upload happen while the user is searching for their next file. Other than that, the total overhead of this format is tremendously greater than for x-www-form-urlencoded, but it is still likely negligible compared to the size of your attached files.

3. HTTP GET (variables passed as part of the HTTP request)

This is basically an x-www-form-urlencoded string tacked onto the end of the URL. See #1. Be aware, though, that URLs have length limits that POST bodies don't (Internet Explorer, for instance, caps URLs at around 2,000 characters), so GET is a poor fit for 100 fields.
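To check the size estimate in #1 against your own field contents, here's a quick sketch (the field names and sample value are made up):

<?php
// Encode a sample of 100 fields exactly as the browser would
// (application/x-www-form-urlencoded) and measure the result.
$fields = array();
for ($i = 0; $i < 100; $i++) {
    $fields['var' . $i] = 'Pack my box with five dozen liquor jugs';
}
$encoded = http_build_query($fields);
echo strlen($encoded), " bytes\n"; // a little under 5 KB for this sample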

The other major factor you should consider is how quickly you will process this form. The upload time may be small, but if each form element requires 3 seconds of database access and processing, then 100 items cause a 5-minute wait after the form is submitted. Without knowing what your form content is or what you're really trying to achieve here, I can't provide advice on this matter other than to say "consider this".
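If the per-field cost does turn out to be database work, batching everything into one transaction with a single prepared statement usually collapses that wait. A sketch using PDO, with a hypothetical form_values table:

<?php
// One transaction and one prepared statement for all 100 fields,
// instead of 100 separate round trips.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
$stmt = $pdo->prepare(
    'INSERT INTO form_values (name, value) VALUES (:name, :value)'
);
foreach ($_POST as $name => $value) {
    $stmt->execute(array(':name' => $name, ':value' => $value));
}
$pdo->commit();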

Upvotes: 1

Christofer Eliasson

Reputation: 33865

Wouldn't it be better to break it down into several requests, with the form divided over several steps?

Displaying 100 fields to a user sounds really bad from a user-experience perspective. How about splitting it into multiple steps? Less overwhelming for the user, and perhaps easier for you to manage on the server side as well.

Let's say the user closes the window by accident, loses her Internet connection, or some other issue occurs that causes the data not to be sent to the server. If that happened to me after completing 100 input fields, I would probably leave the page and never come back.

From a performance perspective, I don't believe it should make much difference, but I would really recommend taking another approach anyway.
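One way to do the multi-step approach server-side is to accumulate each step's fields in the session, so nothing is lost between pages. A minimal sketch (the final_step field is hypothetical):

<?php
session_start();

// Merge this step's fields into whatever earlier steps saved.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    if (!isset($_SESSION['form_data'])) {
        $_SESSION['form_data'] = array();
    }
    $_SESSION['form_data'] = array_merge($_SESSION['form_data'], $_POST);
}

// On the final step, process the combined data, then clear it.
if (isset($_POST['final_step'])) {
    $allFields = $_SESSION['form_data'];
    // ... validate and save $allFields ...
    unset($_SESSION['form_data']);
}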

Upvotes: 2

Tessmore

Reputation: 1039

100 shouldn't be that much of a problem, but you can group input fields by using '[]' in their names.

Thus:

Apple
<input type='checkbox' name='fruity[]' value='apple'>

Pear
<input type='checkbox' name='fruity[]' value='pear'>

Banana
<input type='checkbox' name='fruity[]' value='banana'>

And if you check pear and banana and then submit, it results in:

$_POST = array(
  'fruity' => array(
    0 => 'pear',
    1 => 'banana'
  )
)
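On the PHP side, the grouped values arrive as an ordinary array, so the whole batch can be handled in one loop, for example:

<?php
// $_POST['fruity'] is array('pear', 'banana') for the submission above.
if (isset($_POST['fruity']) && is_array($_POST['fruity'])) {
    foreach ($_POST['fruity'] as $index => $fruit) {
        echo "Checkbox $index: $fruit\n";
    }
}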

Upvotes: 0

Tadeck

Reputation: 137420

The standard method of passing data using POST requests, where the values are available in the $_POST array, is perfectly fine. It has a very high upper limit on the amount of data that can be sent.

If you want something more flexible, you can serialize the data first, send it, and then retrieve it on the server with http_get_request_body() (from the pecl_http extension). But you should not need that: forms are perfectly fine for the scenario you are describing.
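If you did go that route, here is a sketch of the receiving side, assuming the client sends a JSON-serialized payload as the raw request body (file_get_contents('php://input') is the core-PHP way to read it, equivalent to what http_get_request_body() returns):

<?php
// Read the raw request body.
$raw = file_get_contents('php://input');

// Assume the client serialized its variables as JSON.
$data = json_decode($raw, true);
if (!is_array($data)) {
    http_response_code(400);
    exit('Malformed payload');
}

// $data now holds the same key/value pairs $_POST would have.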

Upvotes: 0
