A colleague of mine was recently involved in an effort to migrate documents from a custom application into Documentum. While the custom application had no folder structure, Documentum does. The client was insistent on not spending any time or effort designing even a rudimentary folder structure (taxonomy) for the migrated documents. Instead, the client wanted to import over 50,000 documents into a single folder, arguing that a folder structure was unnecessary and that search would meet all of their needs, as it had in the old custom application. Is this the right thing to do?
I disagree with this approach if any metadata is available with the migrated content. Most content management applications capture metadata such as creation date, document type, or department. That metadata can be used to aggregate the migrated content into at least 10-20 folders, potentially reducing the number of documents in a single folder from 50,000 to perhaps 2,000-3,000.
Why does this matter? Obviously, you still would not want a user to browse through a few thousand documents in a folder. The real benefit of aggregating content into smaller buckets is the ability to process documents in batches. That may not be a current requirement, but it supports future needs.
Imagine the migrated content segregated into yearly/monthly subfolders. That would facilitate archiving content based on creation month, and it would lend itself to applying retention policies and disposition dates to different groups of documents.
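As a rough illustration of the idea, here is a minimal Python sketch of how such a date-based bucketing pass might look during migration. The document names, the `created` attribute, and the `/Migrated` root path are all hypothetical, and a real Documentum import would use its own APIs; the point is only that a single metadata field is enough to derive a yearly/monthly folder for each document.

```python
from datetime import date

def folder_for(created: date, root: str = "/Migrated") -> str:
    """Derive a yearly/monthly target folder from a creation date (assumed root path)."""
    return f"{root}/{created.year}/{created.month:02d}"

# Hypothetical migrated documents, each carrying a creation-date attribute.
docs = [
    {"name": "invoice-001.pdf", "created": date(2011, 3, 14)},
    {"name": "contract-xyz.doc", "created": date(2011, 3, 2)},
    {"name": "memo-17.txt", "created": date(2012, 11, 30)},
]

# Group documents into their target folders before import.
buckets: dict[str, list[str]] = {}
for doc in docs:
    buckets.setdefault(folder_for(doc["created"]), []).append(doc["name"])

for folder, names in sorted(buckets.items()):
    print(folder, names)
```

Each bucket can then be imported, archived, or assigned a retention policy as a unit, which is exactly the batch-processing benefit described above.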
If no metadata is available with the migrated content, then it makes no sense to expend time and effort trying to impose a folder structure (taxonomy). So now you know where to put it.