[QODBC-Desktop] QODBC Slow or Fails on Large InvoiceLine DataSet
Posted by Juliet (QODBC Support) on 03 August 2010 08:02 AM

Troubleshooting: QODBC Slow or Fails on Large InvoiceLine DataSet

Problem Description 1

      I have tried several ways (unsuccessfully) to get a complete data dump of the InvoiceLine data from QB Ent. 6 Mfg. Ed. There are over 100K records in that data set. I had the client upgrade to QB Ent. 2006 from QB 2005 Pro for this reason. I have tried the following methods for getting this data set:

      - SQL Server DTS copy to a SQL Server table
      - MS Access linked table with a customized query
      - Crystal Reports using a Command
      - Crystal Reports using a Table
      - QODBC Support Wizard

      I have tried both optimized and unoptimized queries. They are all failing at one point or another. I did get SQL Server DTS to do one complete dump, which took over 6 hours! Since then, I have not even been able to get that to work again.

      Since then, I have cleaned up the company data, made a portable file, re-opened it to reduce the file size, and performed a rebuild on the data several times. I have also reinstalled QODBC a few times and changed the optimizer directories to reset the optimizer, which reports "Data Load Successful" when it hasn't done anything.

      When running an unoptimized query, the driver status panel will appear and seems stuck at "Open Table" for an extended period or sometimes indefinitely. WHAT AM I DOING WRONG?

Solution 1 - Narrow Date Range

     When QODBC calls QuickBooks here, QuickBooks tries to return a single huge XML document containing all 100,000+ records, which uses up all the system resources on your computer.

I suggest you extract the information you require year by year by running subset queries like this instead:

      SELECT * FROM InvoiceLine UNOPTIMIZED WHERE TxnDate >= {d '2009-01-01'} AND TxnDate < {d '2010-01-01'}
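If you need several years, the pattern above can be scripted. A minimal Python sketch (the table name and year range are illustrative; the generated statements use the ODBC {d '...'} date-literal syntax shown above):

```python
def year_queries(table, first_year, last_year):
    """Build one UNOPTIMIZED subset query per calendar year."""
    queries = []
    for year in range(first_year, last_year + 1):
        queries.append(
            f"SELECT * FROM {table} UNOPTIMIZED "
            f"WHERE TxnDate >= {{d '{year}-01-01'}} "
            f"AND TxnDate < {{d '{year + 1}-01-01'}}"
        )
    return queries

# Run each statement through your ODBC client one at a time.
for q in year_queries("InvoiceLine", 2008, 2010):
    print(q)
```

Each query returns a bounded record set, so QuickBooks never has to build one giant XML document.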

Optimizer Related Information

      See: How to set up QODBC Optimizer and where are the Optimizer options for full details about the Optimizer.

       By default, the optimizer will first update the new and changed entries in a table from QuickBooks, and then execute the query against the locally optimized table. This is faster than reading everything out of QuickBooks every time, especially the more data you have.

      If you run the same year-by-year subset queries without the UNOPTIMIZED tag, each query will also update your optimized table as it runs.

Solution 1 - Reset Optimizer File

     If queries that previously worked fine suddenly start crashing or becoming slow, this usually indicates that your optimizer file is corrupted. A corrupted .opt file does not require reinstalling QODBC; resetting the Optimizer is enough. See: How to switch OFF or RESET the QODBC Optimizer for more.

Reset and Reload Information


1. How long should a full rebuild take?

2. (The QB data file is approximately 300MB in size.) If an Invoice record, for example, is added or changed, will the optimizer update ONLY the new/changed records in the optimized data set, or does it need to completely re-optimize all of the records in the tables affected by the addition/change? The actual physical activity and time frames I have experienced do not seem to match the "update changed records only" description I received in an earlier post. Your advice on these issues has continued to be to "run UNOPTIMIZED queries." Is there a reason for this? Can I expect problems running optimized queries while users actively use the system?


1. With large company files, we don't recommend a full load of all the data at once. We recommend optimizing only the tables you require, on a table-by-table basis. For example, you can fully resync your optimized InvoiceLine table by running the following:

sp_optimizefullsync InvoiceLine

2. By default, the QODBC optimizer will update new and changed entries in a table from QuickBooks and then execute the query against the locally optimized table. This is faster than reading everything out of QuickBooks every time, especially the more data you have. Please remember that the QODBC Optimizer .opt file is local to your computer (single user) only and can't be shared with other users.

   I suggested the UNOPTIMIZED tag to call QuickBooks directly because the basic rule is: always read historical information out of the .opt file (NOSYNC), and read very new data directly out of QuickBooks (UNOPTIMIZED).
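That rule can be expressed as a pair of queries split at a cutoff date. A hedged sketch (the helper name and cutoff date are illustrative, not part of QODBC):

```python
def split_queries(table, cutoff_iso):
    """Historical rows come from the local .opt cache (NOSYNC);
    rows on or after the cutoff come live from QuickBooks (UNOPTIMIZED)."""
    lit = f"{{d '{cutoff_iso}'}}"
    historical = f"SELECT * FROM {table} NOSYNC WHERE TxnDate < {lit}"
    recent = f"SELECT * FROM {table} UNOPTIMIZED WHERE TxnDate >= {lit}"
    return historical, recent

hist, recent = split_queries("InvoiceLine", "2010-07-01")
print(hist)
print(recent)
```

Combining the two result sets in your client application gives you current data without forcing QuickBooks to re-send the historical bulk.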



Problem Description 2

     I am resigned to the InvoiceLine table taking up to 5 hours to optimize on this system because it is so large. What I don't understand is why the optimizer insists on optimizing the "ReceivePaymentLine" table when I try to access the optimized InvoiceLine table. There are no references to that table in my query AT ALL. My query is "SELECT * FROM InvoiceLine NOSYNC". Why would it try optimizing that table when there is no reference to it? Does this mean that the .opt file is corrupt? If so, could there be an issue or bug with large data sets (over 100K records)? Again, I am stumped.

Solution 2 - sp_optimizefullsync and sp_optimizeupdatesync

ReceivePaymentLine with Optimizer-Related Information

     We look at the ReceivePaymentLine internally to check if there are any payments that need to be updated against InvoiceLine stored in the optimized table.

     Because you ran "SELECT * FROM InvoiceLine NOSYNC," the SELECT * forces a ReceivePaymentLine lookup to ensure open balances are correct. Because ReceivePaymentLine had not yet been optimized, the Optimizer started optimizing it, even with NOSYNC.

     So I suggest you optimize your ReceivePaymentLine table too:

sp_optimizefullsync ReceivePaymentLine

Optimizer Update Information

     Once the InvoiceLine and ReceivePaymentLine tables have been optimized, you can update them simply by querying them, or you can update them manually at any point in time by running the following:

sp_optimizeupdatesync ReceivePaymentLine

sp_optimizeupdatesync InvoiceLine

Note: Once your InvoiceLine, SalesOrderLine, and ReceivePaymentLine tables have been optimized, QODBC will, by default, update the .opt file whenever you query the table. So you only need sp_optimizeupdatesync as part of a batch reporting or updating process; you don't need to run it every night just to use QODBC.
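If the resync is part of a scheduled batch, the stored-procedure calls can be driven from a script. A sketch under stated assumptions: the DSN name is hypothetical, and actually running it requires pyodbc and the QODBC driver on Windows, so the connection step is kept in its own function:

```python
def resync_statements(tables):
    """One sp_optimizeupdatesync call per table, in order."""
    return [f"sp_optimizeupdatesync {t}" for t in tables]

def run_nightly_resync(dsn="QuickBooks Data",
                       tables=("ReceivePaymentLine", "InvoiceLine")):
    # Hypothetical DSN name; needs pyodbc + QODBC installed, hence the lazy import.
    import pyodbc
    conn = pyodbc.connect(f"DSN={dsn}")
    try:
        cur = conn.cursor()
        for stmt in resync_statements(tables):
            cur.execute(stmt)  # each call refreshes that table's .opt entries
    finally:
        conn.close()
```

Scheduling this once per night (for example, with Windows Task Scheduler) keeps the .opt file current before the morning's reporting run.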

     What you should now avoid is running full SELECT * queries for every column and every row in a table. Imagine if QuickBooks itself did that; it would be extremely slow.

Solution 2 - Iterator or Chunk Size

      Sometimes, because of the extensive record sets being returned, the Iterator or Chunk Size in the Advanced Tab of the QODBC Setup Screen needs to be increased to accommodate the largest returned record set. The default is 500, and it can be set to any value up to 100,000.
