How to optimize?

I make a lot of requests to my database from PHP, some of which are triggered by JavaScript.
For example, I have one page that builds an HTML table; each row shows a calculation based on the result of a database fetch.

The database table I fetch from is big: about 50,000 rows and more than 200 columns, but I only fetch the necessary columns.

So I ended up with a button click triggering the first fetch via jQuery/AJAX; when that result comes back, the next jQuery/AJAX request is triggered, and so on until every result on the page is done.

Even though this is a big improvement over fetching the full result at once, it is still heavy on the server.
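
To make that concrete, here is roughly what each of those AJAX calls hits on the server. This is a simplified sketch; the real table, column names and calculation are different, and I'm assuming MySQL via PDO:

```php
<?php
// fetch_row.php - called once per table row by the chained jQuery/AJAX requests.
// Table, column names and the calculation are invented for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$id = (int) ($_GET['id'] ?? 0);

// Only the necessary columns are selected, never SELECT *.
$stmt = $pdo->prepare('SELECT col_a, col_b, col_c FROM results WHERE id = :id');
$stmt->execute(['id' => $id]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

// The "calculation" that ends up in the HTML row.
$value = $row ? $row['col_a'] * $row['col_b'] + $row['col_c'] : null;

header('Content-Type: application/json');
echo json_encode(['id' => $id, 'value' => $value]);
```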

My general question is: how do I handle large searches in my database most efficiently?

I am familiar with HTML, PHP, JavaScript and CSS.

Thanks in advance!

So what's the actual problem? 50k rows is not much for a database server, especially without knowing the expected performance, request frequency and current delivery time. If your queries run "slow", you may be able to optimize them; use EXPLAIN for that. The DB server should also cache results. You can try MATERIALIZED VIEWS if your DB server supports them, as PostgreSQL does. And you should check that the necessary indexes are in place. You could also switch to a more specialised data store such as Redis for in-memory data, or Elasticsearch / ArangoDB.
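
For example, you can run EXPLAIN straight from PHP to see whether a query can use an index or has to scan the whole table. A rough sketch assuming MySQL via PDO, with the query, column and index names invented:

```php
<?php
// Hypothetical example: inspect the query plan, then add an index if it shows a full scan.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

// EXPLAIN shows whether MySQL can use an index or has to read all ~50k rows.
$plan = $pdo->query('EXPLAIN SELECT col_a, col_b FROM results WHERE customer_id = 42')
            ->fetchAll(PDO::FETCH_ASSOC);
print_r($plan); // look at the "type", "key" and "rows" columns

// If "key" is NULL and "rows" is close to 50000, an index on the filter column helps:
$pdo->exec('CREATE INDEX idx_results_customer ON results (customer_id)');
```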

This is actually a very tiny database. The column count is curious and suggests the schema isn't normalized, but the actual record count is very small.
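
Purely as a hypothetical illustration (nobody here knows your real schema): if many of those 200+ columns are repeating groups like value_1 … value_50, normalization would move them into a child table with one row per value, for example:

```php
<?php
// Hypothetical sketch of pulling repeating "value_N" columns out of one wide table
// into a child table - classic first-normal-form cleanup. All names are invented.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

$pdo->exec('
    CREATE TABLE result_values (
        result_id      INT UNSIGNED NOT NULL,      -- FK to the existing results table
        value_no       SMALLINT UNSIGNED NOT NULL, -- which of the former value_N columns
        measured_value DECIMAL(14,4) NULL,
        PRIMARY KEY (result_id, value_no),
        FOREIGN KEY (result_id) REFERENCES results (id)
    )
');
```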

You don't mention what you are searching for or how. Full-text searches take longer than direct lookups. Searching columns that aren't indexed causes slow queries as well, but that doesn't mean you need to index everything, because that would create its own performance drain.
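
For instance, if the search always filters on one or two columns, indexing just those is usually enough. A rough sketch with invented column names, assuming MySQL/InnoDB:

```php
<?php
// Hypothetical: index only the columns that are actually searched.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

// A plain B-tree index helps direct lookups and prefix searches like LIKE 'abc%'.
$pdo->exec('CREATE INDEX idx_results_status ON results (status)');

// For word searches inside long text, a FULLTEXT index beats LIKE "%abc%".
$pdo->exec('CREATE FULLTEXT INDEX ft_results_notes ON results (notes)');
$stmt = $pdo->prepare(
    'SELECT id FROM results WHERE MATCH(notes) AGAINST (:term IN NATURAL LANGUAGE MODE)'
);
$stmt->execute(['term' => 'overdue invoice']);
```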

Red flag. Post your DB schema. As mentioned, it sounds like the DB is not normalized.
