The best way to get data from clean and dirty URLs

I am writing an application that receives data from URLs, and I want it to work whether the user is using "clean" URLs (e.g. http://example.com/hello/world ) or "dirty" URLs (e.g. http://example.com/?action=hello&sub=world ).

What would be the best way to get variables from both URL schemes?



4 answers


If your mod_rewrite configuration contains the following rule:

RewriteRule ^hello/world /?action=hello&sub=world [NC,L]


or, more generally:

// Assuming only lowercase letters in action & sub..
RewriteRule ^([a-z]+)/([a-z]+) /?action=$1&sub=$2 [NC,L]


then the same PHP script is called, with the $_REQUEST variables available regardless of how the user accesses the page (dirty or clean URL).

We recently moved a significant portion of our site over to clean URLs (while still supporting the older, dirty URLs), and rules like the one above meant we didn't have to rewrite any code that relied on $_REQUEST parameters — we only had to add mod_rewrite rules.
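A minimal sketch of the shared handler (a hypothetical index.php; the parameter names match the example rule above):

```php
<?php
// Minimal sketch: the same handler serves both URL styles,
// because mod_rewrite maps the clean form onto the dirty
// query string before PHP ever runs.
function handleRequest(array $request): string
{
    $action = $request['action'] ?? 'index';
    $sub    = $request['sub'] ?? '';
    return "dispatching {$action}/{$sub}";
}

// In production you would call: handleRequest($_REQUEST);
```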



Update

Mod_rewrite is an Apache module, but there are a number of comparable options available for IIS.

Regardless of which web server you choose to support, the mod_rewrite approach will most likely be the minimum amount of work for you. Without it, you would probably have to create stub files to mimic the structure of your clean URLs. For example, in the web server root you would create a directory hello and place a file world in it containing something like the following:

<?php
// Set the $_REQUEST params to mimic the dirty url
$_REQUEST['action'] = 'hello';
$_REQUEST['sub'] = 'world';
// Include the existing file so we don't need to re-do our logic
// (assuming index.php at the web root)
include('../index.php');


As the number of parameters you want to handle cleanly increases, so does the number of directories and stub files you need, which will greatly increase your maintenance burden.

mod_rewrite is designed for this kind of problem and is now supported on both IIS and Apache, so I highly recommend going in that direction!



If you are running your application on an Apache server, I would recommend using mod_rewrite.



Basically, you code your application to use dirty URLs internally. That is, you can still use "clean" URLs in templates, etc., but when parsing the URL you work with the "dirty" version. For example, if the real, "dirty" URL is www.domain.com/index.php?a=1&b=2, inside your code you would still use $_GET['a'] and $_GET['b']. Then, with the power of mod_rewrite, you simply map URLs such as www.domain.com/1/2/ onto the dirty URL. (This is just one example of how it can be done.)
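A concrete sketch of what this answer describes (a hypothetical rule, assuming the clean segments are purely numeric as in the example):

```apache
# Rewrite the clean URL /1/2/ onto the internal dirty URL.
# The visitor sees www.domain.com/1/2/ but the code sees
# $_GET['a'] and $_GET['b'] as usual.
RewriteRule ^([0-9]+)/([0-9]+)/?$ /index.php?a=$1&b=$2 [NC,L]
```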



It sounds like a pain, but my suggestion is to create a function that parses the URL on every request. First, determine whether it is a dirty or clean URL; to do that, I would start by checking for the presence of a question mark and go from there (additional checks would obviously be needed). For dirty URLs, use the normal PHP superglobals ($_GET['variable_name']). For "clean" URLs, use regular expressions. This would be the most flexible (and efficient) way to parse URLs and extract potential variables.
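A sketch of such a function, assuming the raw request path is available (e.g. via $_SERVER['REQUEST_URI']) and that clean URLs follow the /action/sub pattern from the question:

```php
<?php
// Sketch: extract parameters from either URL style.
function extractParams(string $uri): array
{
    $parts = parse_url($uri);
    if (!empty($parts['query'])) {
        // Dirty URL, e.g. /?action=hello&sub=world
        parse_str($parts['query'], $params);
        return $params;
    }
    // Clean URL, e.g. /hello/world
    $segments = array_values(array_filter(explode('/', $parts['path'] ?? '')));
    return [
        'action' => $segments[0] ?? null,
        'sub'    => $segments[1] ?? null,
    ];
}
```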



The quick and dirty way might be to just check for GET variables. If any are present, it is dirty; if not, it is clean. Of course, it depends on what exactly you mean by dirty URLs.
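That check can be written as a one-liner, as a sketch (non-empty $_GET means the dirty form was used):

```php
<?php
// Sketch: a request is "dirty" if any GET parameters arrived.
function isDirtyUrl(array $get): bool
{
    return !empty($get);
}

// Usage: isDirtyUrl($_GET)
```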


