[QODBC-Desktop] QODBC Slow or Fails on Large InvoiceLine DataSet
Posted by Juliet (QODBC Support) on 03 August 2010 08:02 AM


Problem Description 1

      I have tried several ways, unsuccessfully, to get a complete data dump of the InvoiceLine data from QuickBooks Enterprise 6 Manufacturing Edition. There are over 100K records in that data set. I had the client upgrade from QB 2005 Pro to QB Enterprise 2006 for this reason alone. I have tried the following methods to get this data set:

      SQL Server DTS copy to a SQL Server table; MS Access linked table with a customized query; Crystal Reports using a Command; Crystal Reports using a Table; the VB Demo.

      I have tried both optimized and unoptimized queries, and they all fail at one point or another. I did get SQL Server DTS to do one complete dump, which took over 6 hours! Since then I have not been able to get even that to work again.

      Since then I have cleaned up the company data, created a portable file and re-opened it to reduce the file size, and rebuilt the data several times. I have also reinstalled QODBC a few times and changed the optimizer directory in order to reset the optimizer, which reports "Data Load Successful" when in fact it hasn't done anything.

      When running an unoptimized query, the driver status panel appears and gets stuck at "Open Table" for an extended period of time, sometimes indefinitely. WHAT AM I DOING WRONG?

Solution 1 - Narrow the Date Range

     When QODBC calls QuickBooks here, QuickBooks tries to return one huge XML document containing all 100,000 records at once, which exhausts the system resources on your computer.

I suggest you extract the information you require year by year by running subset queries like this instead:

      SELECT * FROM InvoiceLine UNOPTIMIZED WHERE TxnDate >= {d '2009-01-01'} AND TxnDate < {d '2010-01-01'}
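To cover the full history, run one such query per year. A sketch (the date literals are illustrative; adjust them to span the first and last transaction years in your company file):

```sql
-- One pass per year keeps each XML response from QuickBooks small
SELECT * FROM InvoiceLine UNOPTIMIZED
WHERE TxnDate >= {d '2008-01-01'} AND TxnDate < {d '2009-01-01'}

SELECT * FROM InvoiceLine UNOPTIMIZED
WHERE TxnDate >= {d '2009-01-01'} AND TxnDate < {d '2010-01-01'}
```

Each query returns a bounded record set, so the driver never has to hold the entire 100K-row result in memory at once.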

Optimizer Related Information

      See: How to setup QODBC Optimizer and where are the Optimizer options for all the details about optimizer.

      By default, the optimizer will update new and changed entries in a table from QuickBooks first and then execute the query against the locally optimized table. This is faster than reading everything out of QuickBooks every time, especially the more data you have.

      If you extract the information you require year by year with subset queries as shown above, each query will also update your optimized table.

Solution 1 - Reset the Optimizer File

     Sometimes queries that worked fine before will suddenly crash or become slow; this usually indicates that your optimizer file is corrupted. When your .opt file is corrupted, you don't have to reinstall QODBC; see: How to switch OFF or RESET the QODBC Optimizer for more.

Reset and Reload Information

Questions:

1. How long should a full re-build take?

2. (The QB data file is approximately 300MB in size.) If an Invoice record, for example, is added or changed, will the optimizer kick in and update only the new/changed records in the optimized data set, or does it need to completely re-optimize all of the records in the tables affected by the addition/change? The actual activity and time frames I have experienced do not seem to match the "update changed records only" description I received in an earlier post. Your advice on these issues has continued to be to "run UNOPTIMIZED queries". Is there a reason for this? Can I expect problems running optimized queries while users are actively using the system?

Answers:

1. With large company files, we don't recommend a full load of all the data at once. In fact, we recommend optimizing only the tables you require, on a table-by-table basis. For example, you can fully resync your optimized InvoiceLine table by running:

sp_optimizefullsync InvoiceLine

2. By default, the QODBC optimizer will update new and changed entries in a table from QuickBooks first and then execute the query against the locally optimized table. This is faster than reading everything out of QuickBooks every time, especially the more data you have. Please keep in mind that the QODBC Optimizer .opt file is local to your computer (single user) only and it can't be shared with other users.

   I used the UNOPTIMIZED tag to call QuickBooks directly because the basic rule is to always read historical information out of the .opt file (NOSYNC) and to read very new data directly out of QuickBooks (UNOPTIMIZED).
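A sketch of that rule applied to the InvoiceLine table (the cut-over date is an assumption; choose one that separates settled history from recent activity):

```sql
-- Historical rows: served from the local .opt file, no QuickBooks round trip
SELECT * FROM InvoiceLine NOSYNC WHERE TxnDate < {d '2010-01-01'}

-- Very new rows: read live from QuickBooks, bypassing the optimizer
SELECT * FROM InvoiceLine UNOPTIMIZED WHERE TxnDate >= {d '2010-01-01'}
```

The NOSYNC query is fast because it never touches QuickBooks, while the UNOPTIMIZED query stays fast because its date range keeps the live result set small.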

Problem Description 2

     I am resigned to the fact that the InvoiceLine Table takes up to 5 hours to completely optimize on this system because it is so large. What I don't understand is why the optimizer insists on Optimizing the "ReceivePaymentLine" table when I try to access the optimized InvoiceLine Database. There are no references in my query to that table AT ALL. My query is simply "SELECT * FROM INVOICELINE NOSYNC". Why would it try to optimize that table when there is no reference to it at all? Does this mean that the OPT file is corrupt? If so then there must be an issue or bug with large datasets (over 100K records)? Again, I am stumped.

Solution 2 - sp_optimizefullsync and sp_optimizeupdatesync

ReceivePaymentLine with Optimizer Related Information

     We look at the ReceivePaymentLine internally to check if there are any payments that need to be updated against InvoiceLine stored in the optimized table.

     Because you ran "SELECT * FROM INVOICELINE NOSYNC", the SELECT * forces a ReceivePaymentLine lookup to make sure open balances are correct, and because ReceivePaymentLine hadn't been optimized, that optimization started even with NOSYNC.

     So I suggest you optimize your ReceivePaymentLine table too:

sp_optimizefullsync ReceivePaymentLine

Optimizer Update Information

     Once the InvoiceLine and ReceivePaymentLine tables have been optimized, you just need to execute a query on InvoiceLine to update it, or you can update it manually at any time by running:

sp_optimizeupdatesync ReceivePaymentLine

sp_optimizeupdatesync InvoiceLine

Note: Once your InvoiceLine, SalesOrderLine and ReceivePaymentLine tables have been optimized, QODBC by default will update the .opt file whenever you query the table. So you only need sp_optimizeupdatesync when it is part of a batch reporting or updating process; you don't need to run it every night just to use QODBC.

     What you need to do now is NOT run full SELECT * queries for every column and every row of a table anymore. Imagine if QuickBooks itself did that; it would be very slow.
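Instead, restrict both the column list and the row range. For example (the column names below are illustrative assumptions; verify the actual InvoiceLine column names for your QODBC version):

```sql
-- Fetch only the columns and date range you need
-- (column names are examples; check them with: sp_columns InvoiceLine)
SELECT TxnDate, RefNumber, CustomerRefFullName,
       InvoiceLineItemRefFullName, InvoiceLineQuantity, InvoiceLineAmount
FROM InvoiceLine NOSYNC
WHERE TxnDate >= {d '2010-01-01'}
```

Narrow queries like this return far less data per call and keep the optimized table responsive even while users are in the system.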

Solutions 2 - Iterator or Chunk Size

      Sometimes, because of the large record sets involved, the Iterator or Chunk Size on the Advanced tab of the QODBC Setup Screen should be set high enough to accommodate the largest returned record set. The default is 500, and it can be set to any value up to 100,000.
