I know a lot has been written about Drupal and tables with tens of thousands or even millions of records, but everyone has a different solution, so I want to raise a specific case.
I am using the log module (https://www.drupal.org/project/log) for Drupal 7 to save detailed custom log messages about a process I have created for an internal system running on a small Ubuntu VM in the Azure cloud. My log-entity bundle has four custom fields. At 1.5 million log entries, with each field stored in its own table, that is roughly 6 million field rows, and the count grows every time the process iterates.
As you can imagine, the site slows to a near standstill when running queries against the log tables.
I am thinking of writing a small module to move the entities into a table outside Drupal's schema, or exporting the data to a CSV file.
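To make the CSV-export idea concrete, here is a minimal sketch of the archive-then-delete pattern I have in mind. It uses an in-memory SQLite table as a stand-in for the real database, and the table and column names (`log`, `id`, `timestamp`, `message`) are assumptions for illustration, not the actual schema of the log module:

```python
import csv
import sqlite3

# Stand-in for the real log table; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE log (id INTEGER PRIMARY KEY, timestamp INTEGER, message TEXT)"
)
conn.executemany(
    "INSERT INTO log (timestamp, message) VALUES (?, ?)",
    [(1700000000 + i, f"iteration {i}") for i in range(5)],
)

# Archive everything older than a cutoff timestamp.
cutoff = 1700000003
rows = conn.execute(
    "SELECT id, timestamp, message FROM log WHERE timestamp < ?", (cutoff,)
).fetchall()

with open("log_archive.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["id", "timestamp", "message"])
    writer.writerows(rows)

# Delete only after the export succeeded, so no data is lost.
conn.execute("DELETE FROM log WHERE timestamp < ?", (cutoff,))
conn.commit()
remaining = conn.execute("SELECT COUNT(*) FROM log").fetchone()[0]
```

In the real site this would run from a cron hook or drush script against the live database, keeping the log tables small while the CSV files hold the history.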
Is there a better way?