The adage “too big to fail” refers to financial institutions so large and interconnected that their failure would have seismic repercussions for the economy. But what about those same companies’ plans to manage big data? The new question is whether a data archive can be “too big to move.”
Here is the quandary for a CTO who implemented an archiving system 10+ years ago and has accumulated terabytes of data on what is now “old technology”: what should you do to insulate your company from rising costs and the risk of losing data? How do you keep your data systems current? Are you resigned to throwing duct tape and prayers at your legacy archive, believing it’s simply too big to move? We have heard this lament too often, and we know it is just not so!
By way of example, one of our customers in the financial sector was concerned about an apparent end-of-life scenario with its on-premises EMC Centera storage system and its extremely old AltaVista indexing software. This customer currently stores tens of thousands of discrete archive data sets that continue to grow on a daily basis, and it has regulatory compliance obligations under FINRA rules and the Sarbanes-Oxley (SOX) Act. Like many companies relying on legacy technology, its archive software vendor has been acquired at least three times, raising concerns about ongoing support and bug fixes. For this company, taking no action was not an option, given its belief that the vendor would soon stop supporting the archive altogether. The customer found its solution in Microsoft’s Azure platform. Now its ongoing data streams (we call them “cabinets”) from trading applications, trade confirmations, and trade blotters are all safely captured and stored in the Azure cloud, and the customer no longer believes it is “locked in” with no safe way out.
The best answer for your company may also lie in moving from a proprietary, on-premises software model, with a vendor who has a lock on your data, toward a solution built on an open-standard archive in the cloud. Avoid hardware upgrades and rising maintenance costs, and, most importantly, avoid tying your company’s fortunes to the viability of a proprietary software vendor. Take your data back…because your aging legacy archive is not too big to move.
There is a Way
Archive2Azure is Archive360’s compliance and grey data cloud solution, targeting long-term storage and management of unstructured grey data on the Microsoft Azure platform. The Archive2Azure solution leverages Azure’s low-cost “cool” and “cold” storage tiers as an alternative to expensive on-premises enterprise storage. Azure storage costs as little as $0.02 per GB per month and eliminates the expensive overhead of traditional on-premises storage.
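To put that per-gigabyte figure in perspective, here is a back-of-the-envelope estimate of raw cool-tier storage cost. This is a sketch using only the ~$0.02/GB/month figure cited above; actual Azure pricing varies by region, redundancy option, and access tier, and this excludes transaction, retrieval, and egress charges.

```python
def estimate_storage_cost(size_gb: float,
                          price_per_gb_month: float = 0.02,
                          months: int = 12) -> float:
    """Estimated raw storage cost in US dollars over the given period."""
    return size_gb * price_per_gb_month * months

# For example, a 10 TB archive (10,240 GB):
monthly = estimate_storage_cost(10_240, months=1)  # roughly $205/month
yearly = estimate_storage_cost(10_240)             # roughly $2,460/year
```

Even a multi-terabyte legacy archive lands at a few thousand dollars per year of raw storage, which is the cost dynamic driving moves like the one described above.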
Importantly, Archive2Azure provides automated retention, indexing on demand, encryption, search, review, and production – all essential components of a low-cost, searchable storage solution. Given the clear cost advantages of the Microsoft Azure cloud, it’s no surprise that many companies are looking to Microsoft Azure and Archive2Azure for grey data management and storage.
Archive360 is the market leader in email archive migration software, having successfully migrated more than 12 petabytes of data for more than 500 organizations worldwide since 2012. The company’s flagship product, Archive2Anywhere™, is the only solution in the market purpose-built to deliver consistently fast, trouble-free, predictable archive migrations, with verifiable data fidelity and defensible chain-of-custody reporting. Archive360’s newly released Archive2Azure solution is the industry’s first regulatory compliance and grey data storage solution based on the Microsoft Azure platform. Archive360 is a global organization that delivers its solutions both directly and through a worldwide network of partners. Archive360 is a Microsoft Cloud Solution Provider, and the Archive2Azure solution is Microsoft Azure Certified.