I have a Sage 100 user who is experiencing extremely slow performance when running our Intelligence reports.

For example, she sent me an email this morning that the GL Transaction Report 2-6 had been running for 3 hours and still hadn't completed. She had filtered the data to only transactions from 01/01/2018 forward.

I have seen a very old post about changing the Alchemex.ini file to track down and determine what is causing the slowdown.

Can someone please let me know if this is still a valid approach? Also, there is a reference to a BICORE.exe version. What does that need to be?

I am also wondering if there are other options to increase speed, or to identify whether it is a network issue. They are very frustrated, especially since you can't perform other tasks while the report runs in the background.

Replies to This Discussion

Hello Suzanne,

This article could be helpful to pinpoint what is causing the slow performance:


Your BICORE.exe version will depend on which version of Sage 100 you have, since each version of Sage 100 is packaged with its own BICORE.exe.

Hope this is helpful.

What's the amount of RAM on the workstation she runs the report on?

Also, how many records are in the GL_DetailPosting file?
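If it helps, the record count can be pulled with a quick query. A minimal sketch, assuming a Python DB-API connection; with Sage 100 you would typically connect through pyodbc and the Sage 100 ODBC DSN (commonly `SOTAMAS90`, though your DSN name may differ):

```python
def count_records(conn, table="GL_DetailPosting"):
    """Return the row count of `table` via any DB-API connection.

    The table name is interpolated directly because DB-API parameter
    binding does not cover identifiers; only pass trusted names.
    """
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    (count,) = cur.fetchone()
    return count
```

Against Sage 100 this would look something like `count_records(pyodbc.connect("DSN=SOTAMAS90"))`; the same function works unchanged against any other DB-API driver.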

Attached is a .jpg showing her laptop setup: 64-bit OS, 8 GB RAM. She said that she usually uses the wired network connection rather than wireless, but doesn't see much of a difference.

I'll have to find out how many records are in the GL_DetailPosting file, but if she's only selecting those from 01/01/2018 forward, shouldn't that make the report run faster?




RAM has a lot to do with the speed of generating reports. It's when the report is working with the Excel template that most of the RAM gets used.

A client contacted me wanting to see if I could speed up a report they had built. On their computer the report took around an hour to generate; they had 8 GB of RAM. They sent me the report along with their data, I set it up on my laptop, and it took 2 minutes. I have 32 GB of RAM.

Can you try running the report on a computer with more RAM?

The filter helps with the Excel part of the generation, but the report still needs to read every record in every file to do the filtering.
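That point is worth illustrating: because the date filter is applied on the client after the rows arrive, a tight filter shrinks what reaches the Excel template but not the records read over the network. A small sketch (the row shape is an assumption for illustration):

```python
from datetime import date

def filter_postings(rows, cutoff=date(2018, 1, 1)):
    """Keep only postings on/after `cutoff`.

    Every row must still be read (e.g. off the wire) before the date
    test can be applied, so the scan count equals the full file size
    no matter how selective the filter is.
    """
    scanned = 0
    kept = []
    for posting_date, amount in rows:
        scanned += 1                  # every record is still read
        if posting_date >= cutoff:    # filter applied client-side
            kept.append((posting_date, amount))
    return scanned, kept
```

So a filter of "01/01/2018 forward" speeds up the Excel stage, but the I/O stage still touches every historical record.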

I have done some testing on this, and Sage Intelligence, Sage 100, and Sage 100 ODBC are all I/O-bound, not memory-bound. If you are running across a network, there are some items you can check to help speed up slow reports:

  1. Make sure your Sage 100 server has fast drives: SAS for more than 20 users, ideally SSD.
  2. Make sure you are connecting to Sage 100 over a 1000 Mbps network or better. In real life, 1000 Mbps Ethernet will run between 650 Mbps and 800 Mbps. Sage 100 makes its connections via the Windows file-server protocol, SMB, so when you are in AR and list out your customers, that listing is populated over SMB file sharing.
  3. Following on from point 2, don't use wireless to connect to Sage. Wireless access points often connect clients at less than 100 Mbps and are themselves attached to the network by a single 100 Mbps cable. With just ten users on one access point looking up customers or writing sales orders, they all share that single 100 Mbps link; at peak usage that's only 10 Mbps per workstation, and real-world throughput is about 3/4 of that.
  4. On Windows 10, although there can be security risks involved, try disabling SMB v3. This forces Windows 10 to fall back to SMB v1, which in some Windows 10 builds I have seen significantly improve performance with Sage.
  5. If you work for a small business with just a couple of employees using Sage Intelligence, try running Sage Intelligence directly on the server, via RDP or physical access.
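To put a number on point 2, you can time a sequential read of a large file on the Sage share and see what SMB is actually delivering. A minimal sketch; the UNC path in the usage note is hypothetical, and you'd want a file big enough (hundreds of MB) to get a stable figure:

```python
import time

def measure_read_mbps(path, chunk_size=1 << 20):
    """Sequentially read `path` in 1 MB chunks and return the
    observed throughput in megabits per second."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return (total * 8) / (elapsed * 1_000_000)
```

Pointed at something like `r"\\SAGESERVER\Sage100\MAS90\MAS_ABC\GL_DetailPosting.M4T"` (path is an assumption, substitute your own share), a result well below the 650-800 Mbps range mentioned above suggests the network, not the report, is the bottleneck.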

I know the above is a bit technical; I should document the steps and put them online. I have used all of the steps above to improve the performance of Sage 100, Sage Intelligence, and Sage ODBC connections.


© 2018   Created by Sage Alchemex.