A colleague of mine was recently involved in a migration effort to move documents from a custom application into Documentum. While this custom application did not have a folder structure, Documentum does. The client was insistent on not spending any time or effort designing even a rudimentary folder (taxonomy) structure for the migrated documents. Instead, the client wanted to import over 50,000 documents into a single folder. They insisted that a folder structure was unnecessary and that searching would cover all of their needs, as it had in the old custom application. Is this the right thing to do?
I disagree with this approach if any metadata is available with the migrated content. Most content management applications carry metadata such as creation date, document type, or department. That metadata can be used to aggregate the migrated content into at least 10-20 folders, potentially reducing the number of documents in any single folder from 50,000 to maybe 2,000-3,000.
Why is this important? Obviously, you still would not want a user to browse through a few thousand documents in a folder. The real benefit of aggregating content into smaller buckets is the ability to process documents in batches. This may not be a current requirement, but it positions you to support future needs.
Imagine having migrated content segregated into yearly/monthly subfolders. This would facilitate archiving of content based on creation month. It would also lend itself to applying retention policy and disposition dates for different groups of documents.
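As a rough sketch of the yearly/monthly bucketing described above (the function and folder names here are hypothetical illustrations, not part of any Documentum API), the migration script could derive a target folder from each document's creation date and group documents into per-month batches:

```python
from datetime import date

def folder_for(created: date, root: str = "/Migrated") -> str:
    """Derive a yearly/monthly folder path from a document's creation date."""
    return f"{root}/{created.year:04d}/{created.month:02d}"

# Example creation dates pulled from the source system's metadata.
docs = [date(2008, 3, 14), date(2008, 3, 20), date(2009, 11, 2)]

# Group documents into per-month buckets; each bucket is a small batch
# that can later be archived or given its own retention policy.
buckets: dict[str, list[date]] = {}
for d in docs:
    buckets.setdefault(folder_for(d), []).append(d)
```

With content split this way, applying a retention or disposition rule to one month's folder leaves the other buckets untouched.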
If there is no metadata available with the migrated content, then it does not make any sense to expend time and effort to try to impose a folder structure/taxonomy. So now you know where to put it.
I disagree. Content in a single folder can also be archived, and RPS can also be applied based on metadata.
However, I agree that in most cases it would be unintuitive to have everything in one single folder.
I didn't mean to imply that having everything in one folder would prohibit the archiving of content. Although, if I remember correctly, it's more efficient to place a retention policy on a folder than on individual documents. Having folders gives you the flexibility to apply multiple policies to different folders.
Hi Johnny, how are you doing? Long time no see.
Yes, I agree with you on putting docs in different folders for easier management and classification. It's one of the basic best practices in the real world. As for taxonomy classification, it should be implemented at the attribute level, not via folders.
Just my two cents.
I took part in a similar project.
All the documents were in a hidden cabinet and users were accessing them through predefined smartlists only (thus only the relevant documents were shown).
All cabinets were hidden, except the user’s home and the one containing the smartlists.
Two people in that group were given the rights and training to create/update smartlists. We gave advice and helped tune smartlists when necessary, but they were quite autonomous.
There were many more than 50k documents; nonetheless, it worked fine.
Thanks for your input Marc. You have described a good scenario and mechanism that works fine for your requirements.
I would think that 50k undifferentiated documents means the customer doesn't really care about them. At the same time, if they are to be easily accessible, navigating 50k things means that:
– users have to find their way around
– the UI must be responsive in such an environment. Unbound queries across all those documents would take a long time to return results and would only get worse over time. Not a recipe for success.
Nice case study Johnny.
Migration projects are always a challenge and good to work on.
I have shared one of my experiences on my blog http://ashishhere.wordpress.com.
Please have a look and let me know your thoughts.