Access Database

IMATech

Beta member
Messages
2
Location
DE, United States
I am a recent grad and new to the industry. Currently, my company has an Access database that is about 1.4 GB. This database is accessed via a shared network folder and is fairly sluggish. I am trying to find the bottleneck. Using Performance Monitor, I see that none of the computers' hardware goes above 20% CPU usage or spikes above 40% memory usage. I want to inspect the network traffic, but I am not part of the security department, so I can't exactly use Wireshark or other tools I would be familiar with, and I would rather not have to work with them to do so. All of the I/O looks fine.

I am not 100% sure how Access works as far as syncing. This setup is obviously sub-par and makeshift, so it's a bit of an odd issue. When someone opens the database, a temp copy is pulled to and saved on their computer until it is shut down. When clicking on dropdown lists in the form or running queries, the database seems sluggish. My theory is that, because of the setup, the database copies are sending lots of requests back to the central copy to pull and update information, so it is being bottlenecked by Access itself and/or the network (5-6 people use this at a time). Can anyone confirm my thoughts or provide an alternative explanation, and hopefully explain how Access would interact in this environment? Thanks! (Updating to a different database is not an option currently.)
 
1.4 GB is really, really large, especially for something shared on the network. You may not have a specific bottleneck; that file is just stupid big for Access. Sounds like maybe this was a flat file that just got out of hand.

Per MS, the max size of an Access DB is 2 GB. I know you aren't there yet, but that's really large.

Personally, I would suggest one of two things:
1. Migrate to another database format. SQL Server is a much better solution for what you are using. You can still have the same 'applets' that allow people to access their data, and from a management perspective it would be a much more sound format.
2. Break the data up. Not sure how you have the tables and data set up, but is there any segregation that can happen with this data? Is it one large, flat file, or are there multiple tables in here that all link together? If they are linked, can you split the tables out into multiple files so that you aren't buffering 1.4 GB (well, not that much, but an entire table's worth of data) each time you access something?
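
If you do go the split route, Access can link to tables that live in a separate file over a UNC path, so the front-end only pulls data when a query or form actually asks for it. A minimal VBA sketch of creating such a link is below; the server path and "Orders" table name are hypothetical placeholders, not anything from your setup.

```vba
' Sketch of option 2: instead of storing the table locally, create a
' linked table in the front-end that points at a separate back-end file.
' "\\server\share\Backend.accdb" and "Orders" are hypothetical names.
Sub LinkBackEndTable()
    ' acLink creates a linked table named "Orders" in the current
    ' database that reads from the Orders table in the back-end file.
    DoCmd.TransferDatabase acLink, "Microsoft Access", _
        "\\server\share\Backend.accdb", acTable, "Orders", "Orders"
End Sub
```

You can also do the same thing through the UI via External Data > New Data Source > From Database > Access, choosing "Link to the data source" instead of importing.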
 

I was thinking the same thing here. There's SQL Server Express if you need a free solution but want to stick with Microsoft products; I believe their limit per database is 10 GB.
 
I knew the limit was 2 GB, but I didn't know the database was actually already split. The front-end is about 1.4 GB, as previously stated, and the back-end tables are a good 4 GB. I think this is probably what's slowing it down so much. It doesn't help that it runs through a switch before getting to the shared folder. I originally suggested SQL Server, but my boss didn't want to switch. Looks like I have some convincing to do. Thanks for the advice, guys!
 
Yeah, even with a gigabit switch you are still going to run into latency with that amount of data. SQL Server Express would be your best bet, IMO. Maybe do some tests: migrate some data and prove that the latency is fixed when using SQL Server.
 
You can carve up the database into multiple files and use linked tables.

Whenever I create an MS Access application that is user-facing, I create at least two files. There is a storage Access DB that holds all of the data in its tables.

I then have a client version that has all of the forms and VBA code. All of the tables in it (other than smaller reference tables) are linked tables that point back to the DB that holds the actual data. This way, when I load the client for the first time, it isn't multiple GB, and it starts up a lot quicker.

The only time I have to load a lot of data is on queries and other transactions, which should increase performance by quite a bit.
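
One wrinkle with this client/storage split is that every linked table carries a connect string pointing at the back-end path, so if the share ever moves, the client has to repoint all its links. A hedged sketch of doing that in VBA with DAO is below; the UNC path is a hypothetical placeholder.

```vba
' Sketch: on client startup, repoint every linked table at the shared
' back-end file so the links survive a moved or renamed share.
' "\\server\share\Backend.accdb" is a hypothetical placeholder path.
Sub RelinkTables()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb
    For Each tdf In db.TableDefs
        ' Linked tables have a non-empty Connect string; local and
        ' system tables are skipped by this check.
        If Len(tdf.Connect) > 0 Then
            tdf.Connect = ";DATABASE=\\server\share\Backend.accdb"
            tdf.RefreshLink   ' re-establish the link at the new path
        End If
    Next tdf
End Sub
```

Calling something like this from an AutoExec macro or the startup form keeps the relinking automatic for every user.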
 
Great idea; that would also allow you to keep the current Access architecture while helping with loading times. The only issue you may run into is actually setting up the database links to each table.
 