Some companies face the prospect of archiving electronic documents for very long periods of time, up to 100 years, for regulatory or business reasons. For example, construction companies involved in large projects (bridges, dams, skyscrapers, airports, etc.) must keep all documents related to the project for the actual construction period plus 30, 50, or even 100 years, depending on state and local government regulations. Moreover, the records must remain quickly searchable and readable over those same periods.
Most companies eventually realize that many of their legacy applications and databases cost more than the value they provide. The main reason companies cite for keeping aging applications and databases active is the potential need to respond to regulatory or legal requests. In reality, legacy application data can be archived and managed separately, eliminating the need to keep the legacy applications themselves running.
With the ransomware attacks that have dominated headlines over the last year, many companies are reconsidering their data protection strategies to guard against these new and growing threats.
The Salesforce platform is a leading customer relationship management (CRM) application and cloud computing platform with functionality targeted at sales and marketing professionals. Salesforce offers a wide variety of CRM categories and systems to meet various customer needs including Sales Cloud, Marketing Cloud, Service Cloud, Analytics Cloud, Data Cloud, Community Cloud, App Cloud, and IoT.
With bipartisan support from the U.S., the U.K., and major tech companies, new legislation enacted on March 23, 2018, replaces the outdated 1986 Stored Communications Act. The CLOUD Act was forged out of necessity and fast-tracked after a cross-border conflict erupted when U.S. authorities obtained a warrant in New York for an Irish national's emails stored in Ireland. Microsoft promptly filed suit against the United States, and the Supreme Court is poised to decide that case following oral argument earlier this year; the Justices, however, implored Congress to replace the prior law to avoid a decision predicated on a statute that predated cloud-based computing. Fueling the rush to put new laws in place is the fact that tech companies are incurring massive fines by complying with U.S. law enforcement demands that violate the privacy laws of other nations.
Prior to the availability of cloud archiving, companies were stuck with expensive, on-premises archiving solutions, mainly because they were the only game in town. The archiving software vendors focused on specific industries where government regulations required companies to archive data, for example the financial services industry with its SEC and FINRA compliance requirements. Companies quickly discovered the downsides of on-premises archiving solutions: 1) they were expensive, and 2) they were complicated to maintain. Their main advantage was that your data was stored in your own data center: you controlled your data.
The EU/US Safe Harbor scheme was struck down by the Court of Justice of the European Union (CJEU) in October 2015, putting companies on both sides of the Atlantic in a difficult position: they no longer had a lawful mechanism for transferring data out of the EU to the US.
How His New Machine Learning SW is Causing Big Headaches for the North Pole
AP Report, Dublin, Eire, December 25, 2017; by James M. McCarthy, General Counsel
Having just rebounded from the fallout of defending privacy claims involving its controversial practice of sending a special (and just a bit creepy) elf scout from the North Pole to EU homes to help Santa Claus manage his naughty and nice lists, NorthPole, Inc. is grappling with a new compliance problem…GDPR. Readers will recall that its stock (ST-NIK) took a hit on all exchanges following legal fees and penalties for violations of the EU's Directive 95/46 and the UK's Data Protection Act, which proscribe the automated collection of data that occurred in the "Eric the Elf" debacle.
In my frequent discussions with customers about the benefits of cloud archiving for regulatory, legal, and business reasons, I still find that a large percentage don't worry about archiving corporate social media content.
Microsoft today announced the general availability of their archive tier, Microsoft Azure Archive Blob Storage, to go along with their Hot and Cool storage tiers. For Azure-based archiving and information governance applications, the Azure Archive Blob Storage tier will be a huge advance for records managers and information governance professionals looking for long term, inexpensive archive storage.
Many companies are moving their email, file systems, data archives, basically all of their unstructured data, to the cloud for cost savings, increased security, and ease of access. Finding the right data migration vendor with the correct capabilities and technology to ensure the migration goes off without a hitch is important to making sure the project meets its deadlines, avoids employee disruption, creates a full audit trail of the migration process, and is legally defensible. Likely, the last thing you are thinking about is whether your migration vendor has duplicated your data, or your customers' data, for ulterior purposes.
According to IDC, healthcare data is one of the fastest growing segments of the digital universe – growing from 153 exabytes in 2013 to an estimated 2,314 exabytes in 2020, a 48% annual growth rate. So where will the healthcare industry put all of this critical and sensitive data and how long must it be held?
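As a quick sanity check, the annual growth rate implied by those IDC figures can be computed directly (a minimal sketch; the 153 EB and 2,314 EB numbers are the IDC estimates cited above):

```python
# Compute the compound annual growth rate (CAGR) implied by IDC's figures:
# 153 exabytes of healthcare data in 2013, an estimated 2,314 by 2020.
start_eb = 153        # exabytes, 2013
end_eb = 2_314        # exabytes, 2020 (estimate)
years = 2020 - 2013   # 7-year span

cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")
```

The result works out to just over 47% per year, consistent with the roughly 48% rate IDC reports.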
Today, companies are looking for solutions that can archive inactive data from little-used enterprise applications. Those applications can then be decommissioned, saving the company the expense of keeping them running for little payback. But a question often not addressed early enough in the project is what to do with all of the application's legacy data: delete it or save it (and where)? By migrating the legacy data to an intelligent archive, organizations can preserve the value of legacy application data, ensure regulatory compliance, and address any legal concerns.
A couple of weeks ago I sat in on a presentation about the legal profession and information governance hosted by two attorneys. The presentation was very good; however, a couple of things set my teeth on edge. The two attorneys opened the discussion by pointing out that content is everywhere, is always changing, and is under increased scrutiny, and because of that it can be a major headache for attorneys during discovery and for compliance personnel responding to information requests from government agencies. Hard to disagree with that.
MiFID II is right around the corner (January 2018), and it brings new data handling, storage, and indexing requirements that some (or many) financial services organizations may not be aware of. MiFID II focuses on the EU financial services sector and aims to improve the quality of advice presented to clients as well as offer additional investor protections. To accomplish these goals, the new regulations add data recording, retention, and search requirements.
I am going to revisit a topic I have blogged about before, mostly because of the feedback I received at Microsoft Ignite last month (September): that of records management versus information governance. To state the obvious up front: records management does not equal information governance, and here is why.
The eDiscovery process can be a complex and expensive undertaking. Ever-increasing data stores, new applications and data formats, country regulations limiting data movement, and, increasingly, documents authored in foreign languages continue to drive up cost, time to respond, and risk.
One eDiscovery task that has been an ongoing pain for companies is dealing with foreign language-based documents during collection and review.
Corporations continue to adopt new information technologies that make their jobs both easier and more complex. Companies have adopted new communications platforms like Skype for instant messaging, enterprise social networks like Yammer and Slack, and collaborative groupware applications such as WebEx, GoToMeeting, and video conferencing, not to mention audio and video recording for security. And of course most companies still rely on the old tried-and-true tools like email and telephone/voice messages for day-to-day communications. Many of these tools now allow you to record both audio and video, which matters for regulatory and eDiscovery needs.
The concept of Defensible Disposition has been around for many years. Defensible Disposition is the process of disposing of unneeded, valueless information in a documented manner that shows the deleted data was not subject to regulatory retention requirements or to current or anticipated eDiscovery. In short, it is a data disposition process that ensures regulatory and legal considerations are taken into account.
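To make the idea concrete, here is a minimal sketch in Python of what a defensible disposition gate might look like. The record fields, checks, and audit structure are entirely hypothetical, not any particular product's design: before anything is deleted, the process confirms that no retention rule or legal hold applies, and it records an audit entry documenting the decision either way.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a defensible-disposition check: before deleting a
# record, confirm its retention period has expired and no legal hold applies,
# and produce an audit entry documenting the decision in both cases.

@dataclass
class Record:
    record_id: str
    retention_until: datetime      # end of regulatory/business retention period
    on_legal_hold: bool = False    # subject to current or anticipated litigation

@dataclass
class DispositionAudit:
    record_id: str
    disposed: bool
    reason: str
    checked_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def disposition_check(rec: Record, now: datetime) -> DispositionAudit:
    """Decide whether a record may be disposed of, and say why."""
    if rec.on_legal_hold:
        return DispositionAudit(rec.record_id, False, "legal hold in effect")
    if now < rec.retention_until:
        return DispositionAudit(rec.record_id, False, "retention period not expired")
    return DispositionAudit(rec.record_id, True, "retention expired, no holds")
```

The audit entries, retained after the data itself is gone, are what make the disposition "defensible": they document that each deletion was checked against retention and legal-hold requirements at the time.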
There are many reasons to develop and follow information management policies, including policies for the retention and disposition of information. The most obvious reason is to ensure compliance with regulatory retention requirements. Another is business need: ensuring that data not deemed to have long-term value is disposed of so that IT resources are not consumed by "junk" data.
I continue to hear companies argue that the need for relatively detailed retention/disposition policies stems from their belief that "the law" requires it, in case the company is involved in a lawsuit and eDiscovery. Let me first touch on the first two reasons before I get to the main point of this blog.