What do Marie Kondo, Zen, and enterprise software have to do with each other?
Everything.
In the opening sequence of Marie Kondo’s Netflix series, “Tidying Up with Marie Kondo,” she exclaims, “I’m so excited because I love mess. It’s a never-ending battle to fight the clutter.”
As a former database administrator, I can’t ever say that “I love mess,” but she has a good point: as more data came in from different sources, it became a “never-ending battle to fight the clutter.” Like KonMari, the moniker Marie Kondo is known by, we empathetically listened to our users, weeded through the data, and delivered the data they absolutely could not live without, data that would form the basis of their executive decisions.
Unlike the show and her methods, the one thing we rarely did was throw away the data, but that is the nature of databases: we hoard data. The key for us was to figure out what to do with the clutter, make sense of it, ensure safe delivery, then do it all over again with every new request from the business side. Our goal was to keep a clean, consistent, and simplified process in place, but as time passed, data management frequently became more complex and challenging. It became an inevitable cycle: we would always have to “fight the clutter.”
Sound familiar?
The Problems With Current Database Management Projects
Every time you take on any kind of “database” project – whether it’s evaluating a new database, migrating across on-premise, cloud, or even hybrid systems, or dealing with legacy systems – you must consider: How much time and money is sunk into the project? How many resources are involved? How much time is it taking away from your everyday revenue-growth activities?
Even with data security projects, how are you navigating the complex landscape of blockchains or distributed ledgers? How do you know whether your digital assets are secure, clean, and consistent across your systems? Ultimately, is the cost of implementing these systems adding value to your business?
In reality, database projects are almost always messy endeavors. Not only do you have to consider all of the above, but you are also juggling multiple stakeholders who threaten to complicate scope and delay timelines – an administrator’s nightmare in the quest to keep everything clean, up to date, and consistent.
How a Zen Approach Can Help Clear Out Data Clutter
Shunryu Suzuki, the famous Japanese monk who popularized Zen in the United States, once said, “You should have a general house cleaning of your mind. You must take everything out of your room and clean it thoroughly. If it is necessary, you may bring everything back in again. You may want many things, so one by one you can bring them back in. But if they are not necessary, there is no need to keep them.”
How does this apply to enterprise software? (After all, we’re not talking about the rogue missing sock or the mystery screws in my junk drawer that I insist on keeping … just in case.)
As somebody who has worked on both the technical and business sides, I saw us frequently practice this “house cleaning” whenever we updated or migrated databases, connected data sources to new applications, or even upgraded our hardware. Working through terabytes and petabytes of information, we would consolidate, clean, and manage the data, then move things “back in” so that everything was in order. As KonMari says, “Life truly begins only after you have put your house in order,” and data only becomes useful when it is in its proper form and place.
Considering the Long-term in the Near-term
But like life itself, fully incorporating the philosophies of KonMari and Zen within data management requires consistency, discipline, the ability to scale (which, in life, can be thought of as personal growth), and the right processes and tools. Otherwise, we’re stuck in an eternal loop of house cleaning back to messy, back to house cleaning, then messy again – ending up in a long-term downward spiral toward an unmanageable process.
Specifically, there are now services and technologies that ensure the data presented to your users is consistent, clean, and secure, and that it flows through regardless of the type of underlying database or databases, reducing data clutter. They also eliminate the need to spend time retesting your applications when you replace or migrate your databases or work across disparate on-premise or cloud-based systems, making the entire project far easier and less messy.
In short, what was once a never-ending cycle of fighting the clutter and “house cleaning” can now be broken. With the right approach, it is possible to truly simplify data management, ensuring data reliability and security and, ultimately, driving business value.
About the Author
As the CEO and President of Scalar Labs, Joe McCunney leads software solutions that simplify complex data challenges across the enterprise, including ScalarDB (database management) and ScalarDL (distributed ledger). With over 20 years of technology experience and a deep understanding of database administration, software development and project management, Joe has helped companies across the globe enhance their data management processes.