
Thread: Slow processing speed w/ large geodatabase

  1. #1
    Jan Eberth
    Slow processing speed w/ large geodatabase

    ESRI users and staff,

    I have a major problem using data from a file geodatabase. I have created a map that uses data from a geodatabase of around 600 GB (HUGE, I know). I don't think I should compress it, because I still need to make a lot of edits to various feature classes. The file is so big that it takes forever to load my .mxd, do simple calculations in the Field Calculator, delete a field from a table, or export to other formats (e.g., shapefile). In fact, most of the time ArcMap stops responding altogether or gives me error messages. Please help! I don't want to start over from scratch; I've been working on this database for almost a year.

    Any ideas???

    Sincerely,
    Jan E.

  2. #2
    Jake Skinner

    Re: Slow processing speed w/ large geodatabase

    Are you storing Raster and Vector data within the File Geodatabase? What version of ArcGIS are you using?

  3. #3
    Jan Eberth
    Re: Slow processing speed w/ large geodatabase

    I am using version 10, and all the data is in vector form (administrative boundaries, service area polygons, facility points, etc.).

    Jan

  4. #4
    Vince Angelo

    Re: Slow processing speed w/ large geodatabase

    You have half a terabyte in *vector* data?! How big is the resulting file if you copy it to another directory?

    - V

  5. #5
    Jan Eberth
    Re: Slow processing speed w/ large geodatabase

    Yes, I know it's big. I think part of the problem is that I put all my data in one geodatabase instead of several smaller ones. The database contains:

    1. Facility points (one feature class per year, for 8 years)
    2. Geographic boundary files for the whole U.S. (counties, states, block groups)
    3. Associated geographic centroids (block groups only)
    4. Service areas broken into 3 travel zones (0-10, 10-30, 30-60 min) for ~4,000 facilities
    5. Tables with attributes for these facilities
    6. Various spatial joins of the feature classes above (the biggest of which has about 1.5 million records, i.e., attributes of block groups that fall within these service areas)

    The database is so big I can't copy it to a local drive (not enough space); it's all on a network drive.

    And exporting some of the data to a different geodatabase is hard too, given the slowness (or crashes) I'm experiencing.

    - Jan

  6. #6
    Vince Angelo

    Re: Slow processing speed w/ large geodatabase

    I just bought a pair of 7200 RPM 1 TB SATA drives for $104 (each). Placing all your eggs in one network basket isn't very wise.

    And 600 GB still seems an order of magnitude too large for a vector dataset -- 150 DVDs?

    - V

  7. #7
    Ken Carrier
    Re: Slow processing speed w/ large geodatabase

    Wow, that is a lot of vector data.

    The performance could be related to multiple things:

    1. Network speed.
    2. The number of vertices in your data; the more vertices, the longer it takes to render. You might consider generalizing some of the feature classes individually in ArcMap and see if that improves performance.
    3. Have you updated the spatial index on any of your data? When you edit data on a regular basis, you need to rebuild the indexes so they account for the features added since the last update. You can write a Python script to loop through all your data rather than doing it manually -- maybe run it at the end of the week on Friday. For the amount of data you are talking about, it could take almost all weekend to run.
    4. What are your machine specs? Is your OS 32- or 64-bit?
    5. Attribute indexes will also increase the size of a gdb. When it comes to indexing, more is not always better, since each index adds size, so only add an attribute index when it is needed. Do not index just for the sake of indexing.

    Something else you might try: create a local file gdb, move what you consider to be your problem layers into it, and see if you are still having performance issues. Moving a few at a time might help you narrow down the issue. File gdbs have always been very fast for me, even over a network, so if performance improves with the layers local, you may need to contact your network admin to determine whether the network is in fact the problem.

    For historical data that is stored per year, I would create a file gdb for each year, migrate that data out of the larger gdb into those smaller file gdbs, and then delete it from your current gdb.
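    The grouping step of that per-year split can be sketched in plain Python. The actual moves would use arcpy (arcpy.CreateFileGDB_management to make each target gdb, then arcpy.Copy_management per feature class, with the names coming from arcpy.ListFeatureClasses()). The "Facilities_2004"-style naming pattern below is an assumption -- adjust the regex to match your own feature class names.

```python
import os
import re

# Hypothetical "<layer>_<year>" naming convention (e.g. "Facilities_2004");
# change this pattern to whatever your yearly feature classes are called.
YEAR_RE = re.compile(r"_(\d{4})$")

def plan_migration(fc_names, out_dir):
    """Group yearly feature classes by the per-year file gdb each should
    move to. Non-yearly layers (boundaries, centroids, ...) are skipped
    so they stay in the original geodatabase."""
    plan = {}
    for name in fc_names:
        match = YEAR_RE.search(name)
        if match:
            gdb = os.path.join(out_dir, "facilities_%s.gdb" % match.group(1))
            plan.setdefault(gdb, []).append(name)
    return plan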

    Lots of options to choose from; problems like this can be a nightmare to narrow down. Best of luck.
    Last edited by carrierkh; 04-06-2012 at 09:16 AM.
    Ken Carrier, GISP
    GIS Specialist

  8. #8
    Jan Eberth
    Re: Slow processing speed w/ large geodatabase

    Thanks, Ken, for the response.

    I'm not familiar with spatial indexes or updating them. I believe the problem resulted from a spatial join I performed on my data. Each resulting feature class was about 200 GB. After deleting these 3 feature classes, my dataset is down to about 25 GB total, which is much more reasonable.

    Thanks again,
    Jan E.

  9. #9
    Ken Carrier
    Re: Slow processing speed w/ large geodatabase

    In ArcMap or ArcCatalog, use the search tool built into the application and type "spatial index". You should see Add Spatial Index in the returned results.

    The other option is to go to the geodatabase and right-click the feature class -> Properties -> Indexes tab. Toward the bottom, in the Spatial Index section, there should be 3 buttons: Recalculate, Edit, and Delete.

    Choose Recalculate. In most cases, all locks need to be released before you can perform this operation, so consider doing it when no one else is accessing the data.

    If you know Python, you could write a script that loops through your entire gdb, maybe once a week or once a month, to do this for you.
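    A minimal sketch of that loop, with the arcpy calls passed in as parameters so the structure stands on its own. In real use, list_fcs would be arcpy.ListFeatureClasses (after setting arcpy.env.workspace to the gdb path) and add_index would be arcpy.AddSpatialIndex_management, the scripted counterpart of the Add Spatial Index tool mentioned above; schema locks are the usual failure, so the loop records rather than aborts on them.

```python
def reindex_all(list_fcs, add_index):
    """Rebuild the spatial index on every feature class in a workspace.
    list_fcs: callable returning feature class names
              (e.g. arcpy.ListFeatureClasses with the workspace set).
    add_index: callable that rebuilds one index
              (e.g. arcpy.AddSpatialIndex_management).
    Returns (processed, skipped) so failures can be reviewed afterward."""
    done, skipped = [], []
    for fc in list_fcs():
        try:
            add_index(fc)
        except Exception as err:  # a schema lock typically raises here
            skipped.append((fc, str(err)))
        else:
            done.append(fc)
    return done, skipped
```

    Scheduling it for Friday nights, as suggested above, would be a job for Windows Task Scheduler rather than the script itself.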
    Ken Carrier, GISP
    GIS Specialist
