Importing production DB to live server, error when inserting record

Hello,
I have recently finished a project that I would like to test on a live server. The server contained an earlier version of the same project, but due to a major rework I renamed the DB on the live server (dropping it in the process) and imported the new version from my localhost.
It seems I had created a specific user for the original DB, but I changed the PHP config file to contain the name of the database and the login username and password I have for the account (it is a hosted website with a cPanel interface, so I do not have admin rights, but with these details I have all privileges on the DB).
The website loads, but when trying to create a new account (which adds a row to the user table with the login information) I get my ‘database query failed’ error. The table seems to have been set up properly when compared to my local version, so I don’t think the data I am trying to insert is misaligned with the data types.
I manually added a user row to my user table to test if I can at least access the database from the site, and yes, I am then able to log in as that user I created in phpMyAdmin, as well as update it on my user edit page. But if I try to add a row to another table while logged in (e.g. you can add your address, which adds a row to the address table), this again gives a DB query failure.
I am not sure how to debug this further. Any suggestions as to what this could be? Maybe I need to explicitly set up a user with insert rights when initially creating the new DB? I feel I may have done that with the original site.

You shouldn’t have any ‘database query failed’ error message logic in your code. On a live server, the user/hacker who managed to trigger a database error doesn’t need to know anything about what type of error occurred, just that the page they requested didn’t work.

You should use exceptions for database statement errors and only catch and handle the exception in your code for user-recoverable errors, such as when inserting/updating duplicate data. In all other cases, you would let php catch and handle database statement errors, where php will use its error-related settings (error_reporting, display_errors, and log_errors) to control what happens with the actual error information (database statement errors will automatically get displayed/logged the same as php errors.) If you had done this, the actual database statement error would have been logged on your live system and you could either check the error log file, or temporarily cause php errors to be displayed, which would also cause the database statement errors to be displayed.
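As a minimal sketch of that setup, assuming a PDO connection (the connection values and the table/column names here are only placeholders, not your actual config or schema):

$pdo = new PDO($dsn, $dbUser, $dbPassword, [
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION, // throw exceptions for all query errors
    PDO::ATTR_EMULATE_PREPARES   => false,                  // use real prepared statements
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
]);

try {
    $stmt = $pdo->prepare('INSERT INTO user (username, password_hash) VALUES (?, ?)');
    $stmt->execute([$username, $passwordHash]);
} catch (PDOException $e) {
    // Only handle the user-recoverable case: a duplicate key (MySQL error 1062).
    if ($e->errorInfo[1] == 1062) {
        echo 'That username is already in use.';
    } else {
        // Re-throw everything else so php deals with it per its error settings.
        throw $e;
    }
}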

Thanks, I have replaced this with a die() and redirect statement.
For security, I still need to redo the functions to use prepared statements too.

I found the issue to be MySQL error no. 1366.

It seems I get an error trying to insert into my PK field (id), as ‘’ is an invalid integer. That is still confusing, as the column is set to A_I. This was not an issue on my localhost, so I’m not sure what to make of it. Is there a server setting to bypass this and just have the DB auto-increment, or could something in the insert function be unhappy with the new server? Should I post all the code for inserting?

You should not include an auto-increment field in an insert query.
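For example, assuming a user table where id is the auto-increment primary key (the other column names are just examples):

// Causes error 1366 in strict mode: '' is not a valid integer for the id column.
$sql = "INSERT INTO user (id, username, email) VALUES ('', 'jane', 'jane@example.com')";

// Correct: leave id out of the column list and MySQL assigns the next auto-increment value.
$sql = "INSERT INTO user (username, email) VALUES ('jane', 'jane@example.com')";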

Yes, this seems to be the issue.
The original function looks to see if a DB match can be made; if yes, it updates the record, and for non-existing records it then calls the save function. This seems to be the issue: the save function first pulls an object together with the details, and as id is blank, it still adds id = ‘’ to the query.
This was not an issue on localhost, as Strict Mode was off there, which allows these queries to still insert.
So after my research, it seems I now need to focus my attention first on updating my pages that insert data so they call an actual insert statement, and I will do these with prepared statements.
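Roughly what I have in mind for those inserts, as a sketch only (assuming PDO; the table and column names are just examples, not my actual save code):

function saveAddress(PDO $pdo, array $address): int
{
    // id is deliberately left out of the column list so MySQL assigns it via auto-increment.
    $sql = 'INSERT INTO address (user_id, street, city) VALUES (?, ?, ?)';
    $stmt = $pdo->prepare($sql);
    $stmt->execute([$address['user_id'], $address['street'], $address['city']]);

    // Return the id MySQL generated for the new row.
    return (int) $pdo->lastInsertId();
}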

When multiple concurrent instances of your script are all doing this, a race condition exists where all the instances will either find that a matching row does or doesn’t exist, and will all attempt to update or insert a new row. The first instance of your script to reach the point of executing the update or insert query wins the race. The other instances will either overwrite the previous data or insert duplicate data.

Instead, the appropriate column(s) in your table should be defined as unique index(es) or as a composite unique index. Then use a single INSERT … ON DUPLICATE KEY UPDATE … query, without any SELECT query first, to accomplish this type of operation.
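As a sketch, assuming a unique index on the address table’s user_id column and a PDO connection in $pdo (the table and column names are just examples):

// One-time schema change: enforce at most one address row per user.
// ALTER TABLE address ADD UNIQUE KEY uniq_user (user_id);

$sql = 'INSERT INTO address (user_id, street, city)
        VALUES (?, ?, ?)
        ON DUPLICATE KEY UPDATE street = VALUES(street), city = VALUES(city)';
$stmt = $pdo->prepare($sql);
$stmt->execute([$userId, $street, $city]);

The unique index is what removes the race condition: the database itself either inserts the row or merges it into the existing one, instead of your code trying to detect a duplicate with a SELECT first.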

Thank you all, the prepared statements seem to have worked. Also, checking through all page headers to make sure there is no whitespace (I seem to forget about that when using localhost) didn’t hurt either.
Much appreciated!
