PHP and Search Engine Optimization


My next task with my site is improving its Search Engine Optimization (SEO). It’s really bad as it is now. For further reference, this is the site:
Coming up 3rd in Google when the title of the site is searched is actually pretty good, considering that my Facebook page is 2nd and there are a LOT of people with identical business titles due to how common my name is. However, I want other search terms to be able to find my site too, and I want searches to surface pages within my site, not just the front page.

Based on what I have read from here:

It appears that including parameters in the URL hurts a page's chances of being found by search engine crawlers. If you visit my site, you’ll notice that all content is at the base level of the site. Clicking links doesn’t actually open a different file; it just reloads the same index.php with a different parameter, and index.php uses that parameter to decide which content to combine into a webpage.
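To make the setup concrete, here is a minimal sketch of that parameter-driven dispatch. The parameter name (`page`) and the content keys are hypothetical stand-ins, not the site's actual names:

```php
<?php
// Minimal sketch of index.php dispatching on a URL parameter.
// All names ('page', the content keys) are hypothetical stand-ins.
function renderPage(string $page): string
{
    $content = [
        'home'     => '<h1>Welcome</h1>',
        'products' => '<h1>Products</h1>',
    ];
    // Whitelist lookup: unknown parameters fall back to the front page
    // rather than rendering arbitrary user input.
    return $content[$page] ?? $content['home'];
}

// Every link reloads this same index.php with a different ?page= value:
echo renderPage($_GET['page'] ?? 'home');
```

So to a crawler, every page lives at `index.php?page=...` rather than at its own URL.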

It looks like, if I want to maximize the ability of my site to be found by search engines, both the front and the pages within, then I need to actually have separate files for each page, and they need descriptive titles, descriptive URLs, and smart use of keywords within the page content. I can do all of that. It’ll be a couple hundred pages, but I can just write a function on the admin side of the site to completely automate the process of generating all the pages from the existing MySQL database.

However, I have a question about this. One of the other things my site does is use a header.php and a footer.php. I really, really don’t want to have to copy the header file (which has to handle data for the shopping cart) into every page of the site. If every page has a unique PHP file, but the beginning of the file still has an `include` for the header and the end of the page has an `include` for the footer, will that cause any issues with SEO?
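For reference, this is the pattern in question, sketched with stand-in strings in place of `include 'header.php';` and `include 'footer.php';`. (PHP's keyword is `include`, not C's `#include`; either way the includes are resolved server-side, so a crawler only ever sees the final merged HTML.)

```php
<?php
// Sketch of a per-page file built from shared header and footer parts.
// The $header/$footer strings are stand-ins for include 'header.php';
// and include 'footer.php'; in a real page file.
function renderProductsPage(string $header, string $footer): string
{
    return $header
         . '<h1>Products</h1><p>Page-specific content.</p>'
         . $footer;
}

echo renderProductsPage('<header>nav + cart</header>', '<footer>links</footer>');
```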


I don’t have time to go into detail, but that is incorrect. Amazon does not have a billion separate page files; it is all database driven.


It looks like I misunderstood some of what was being said in the link I gave. The issue isn’t that the page content is generated dynamically from a database, but that the URL contains SEO-unfriendly syntax. It looks like mod_rewrite in the .htaccess file is the solution I’m looking for. It was actually mentioned in the SEO guide I linked to, but I must have glossed over it. If I edit this file so that my URLs are descriptive and free of the three characters ?, =, and &, the site stands a much better chance of being fully explored by crawlers beyond the front page. There are a few other things that need to change in the site for this to work properly.
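A hedged sketch of the kind of rule this means, mapping a descriptive URL such as /products/blue-widget onto the existing parameterised script. The parameter names (`page`, `item`) are assumptions about how the current index.php works:

```apache
# .htaccess sketch: rewrite clean URLs to index.php parameters.
# The parameter names (page, item) are assumed, not taken from the site.
RewriteEngine On
# Don't rewrite requests for real files (images, CSS, ...)
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ index.php?page=$1&item=$2 [L,QSA]
```

The browser and the crawler only ever see the clean URL; the rewrite to the parameterised form happens internally in Apache.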


Use a templating system like League Plates or Twig.

You’ll define one page as the main template; it contains all the HTML for the header and footer and has special placeholders in the sections where page-specific content will be inserted.

Your individual page .php files will contain only the page-specific content of those pages, with no PHP code related to the header and footer; they will simply indicate which template file they want to use and what content should be inserted into the various parts of the template.
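The layout idea can be sketched in plain PHP (Plates and Twig implement this pattern properly, with nicer syntax; all names here are illustrative):

```php
<?php
// Plain-PHP sketch of the template/layout split. The layout owns the
// header and footer markup; a page supplies only its own content.
// All names here are illustrative.
function renderLayout(string $title, string $body): string
{
    return "<html><head><title>{$title}</title></head><body>"
         . '<header>site header + cart</header>'
         . $body                       // the page-specific slot
         . '<footer>site footer</footer>'
         . '</body></html>';
}

// A "page file" is now just its content plus a call into the layout:
echo renderLayout('Products', '<h1>Products</h1>');
```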

Then you’ll use a router class like League Route to define all your URLs and which PHP file should be loaded, or which PHP function should be run, for a specific URL.
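A tiny illustrative router shows the idea (League\Route provides this properly, with HTTP methods, middleware, and so on; everything here is a stand-in):

```php
<?php
// Tiny illustrative router: match the request path against patterns and
// run the first matching handler. Everything here is a stand-in for what
// a real routing library does.
function route(string $path, array $routes): string
{
    foreach ($routes as $pattern => $handler) {
        if (preg_match($pattern, $path, $matches)) {
            return $handler($matches);
        }
    }
    return '404 Not Found';
}

$routes = [
    '#^/$#'                      => fn ($m) => 'front page',
    '#^/products/([a-z0-9-]+)$#' => fn ($m) => "product: {$m[1]}",
];

echo route('/products/blue-widget', $routes), "\n";
```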

The only thing .htaccess will do is direct all URLs to your index.php file, which does the work of figuring out what to show.
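That front-controller rule is short; a common form looks like this:

```apache
# Front controller: send every request that isn't a real file or
# directory to index.php, which routes it to the right handler.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.php [L]
```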