Hi there,
Generally, I think we err on the side of caution when stating specs for database size. I've seen AdminStudio catalogs grow upwards of 9 GB, with thousands of packages in the catalog and thousands of Application Requests in the Workflow Manager part of the database, but this is not typical.
Your own database growth will vary according to your package catalog, and also depending on whether you are doing Conflict Solving and using Workflow Manager Application Requests.
I can't say I've tested the below method, but it's what I would expect to put you in the ballpark:
1. Pick 10 packages that represent an average set of packages you'll be working with. If they are not *.msi packages, Repackage them first.
2. Take each package and run it through a streams extractor like Msix.exe:
http://blogs.msdn.com/b/heaths/archive/2006/04/07/571138.aspx
3. Total up the size of the streams that were extracted from the *.msi package, and subtract this value from the size of the original *.msi file.
4. Multiply this value by 2. This gives you the size of the binary database minus the data that won't end up in the SQL database, then doubles it to account for other records that could end up in the database and be associated with that package (workflows, conflicts, etc.).
5. Repeat for each package, and average the numbers.
6. Multiply by the number of total packages you are packaging.
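The arithmetic in steps 3–6 can be sketched in a few lines. This is just a back-of-the-envelope calculator, not anything from AdminStudio itself; all sizes below are made-up example values, and you'd plug in the stream sizes you measured with the extractor:

```python
# Hedged sketch: estimate SQL catalog size from a sample of *.msi packages.
# Assumes you've already extracted each sample package's embedded streams
# (e.g. with msix.exe) and recorded the sizes; the numbers here are examples.

def estimate_catalog_size(samples, total_packages):
    """samples: list of (msi_size_bytes, extracted_streams_bytes) tuples."""
    per_package = []
    for msi_size, streams_size in samples:
        # Step 3/4: data that actually lands in SQL = MSI size minus embedded
        # binary streams, doubled to allow for related records (workflows,
        # conflicts, etc.).
        per_package.append((msi_size - streams_size) * 2)
    # Step 5: average across the sample.
    average = sum(per_package) / len(per_package)
    # Step 6: scale out to your full catalog.
    return average * total_packages

# Example with hypothetical sizes: 10 sample packages, each a 25 MB MSI
# containing 23.5 MB of streams, projected across a 500-package catalog.
samples = [(25_000_000, 23_500_000)] * 10
print(estimate_catalog_size(samples, total_packages=500) / 1_000_000, "MB")
# → 1500.0 MB
```

The doubling factor in step 4 is deliberately generous; if you know you won't use Workflow Manager or Conflict Solving, you could drop it closer to 1x.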
Now, to speak to your question about intensity of use: the database load is fairly light with almost all tools, with the exception of performing a Conflict Check. Conflict checking is intensive, and can take a long time if the packages you are checking are either numerous or large. The queries involved can end up performing joins against millions of records.
It's for this reason that some shops with a large number of packagers use catalog replication, so that several conflict checks can run concurrently without affecting performance.
Hope this helps!