Migrating Business Data to the Cloud

When businesses elect to have their desktop applications hosted in the cloud with a hosting service provider, they are also electing to have their data hosted with the provider.  This point is not always obvious to non-technical users and those unfamiliar with the hosted application concept.  Many business owners have adopted an online or hosted application solution and then realized after-the-fact that their data was no longer present on their computer.  At least, no current data was present, and it was quite a surprise the day they wanted some information but could not get it because they were not connected to the Internet at the time.  An important thing to remember, and the essential factor in measuring risk associated with use of cloud services and hosted solutions, is that adopting online applications in almost any form means that the data associated with (and possibly even data remotely associated with) the application will also migrate to the cloud.

Migrating on-premises servers – and the applications and data residing on them – to the cloud makes sense for many businesses.  Particularly as network and internet threats increase in number, and as system vulnerabilities are more frequently introduced with remote and mobile access technologies, cloud solutions can significantly help a business mitigate the risks of being connected.  Yet business owners and IT managers must be diligent in understanding the measures their service providers take to protect the customer’s business data and preserve its confidentiality.  And it is essential that any and all tools or services implemented be part of a strictly controlled information management and data protection plan.

Where applications are simply interfaces and logic, the value for a business is in the data used by the applications – data containing information about the company, how and with whom it does business, and how it makes money.  It is critical that the business consider how and where users need access to applications and data, so that a cloud deployment facilitates productivity rather than hindering it.  It is when users become disenfranchised – unable to perform their work due to lack of access to information or tools – that “shadow IT” deployments appear, and data sharing solutions are introduced outside the governance of management or IT.

The vast number of offerings for hosting applications and managing business data in the cloud makes finding and implementing the right business solutions a complicated and often frustrating process.  Even large providers that specialize in delivering from a menu of business cloud solutions often forget that their target customers may not be particularly tech-savvy, and may fail to recognize the nuances in service delivery or protection that could make big differences to the business down the line – as in the case of a system failure or outage.

Among the keys to a successful cloud solution deployment, particularly when critical and frequently used applications and data are to be migrated off-premises, is a thorough understanding of how users currently work with the tools provided, ensuring that processes and utilization can be fully adapted to the new IT model.

As long as users retain their productivity and efficiency – and when improvements in workflows and information access become additional benefits – business data is more likely to remain secure, because users will feel less compelled to find alternative, less secure means of making the business data available from the cloud.  You may want to migrate your business data to the cloud, but you don’t want your data to migrate further than you can reach.

Make Sense?

J

Security and Users: Change is the Only Constant

Managing user accounts and access to business IT assets is challenging, particularly as cloud and social computing models introduce new wrinkles in security and identity management.  Information has become “mobile” along with the users accessing it, yet managing user behavior is even more complicated than managing a digital resource.

If you look at the history of security breaches, you’ll find that many of them started with a user making a mistake – like losing a laptop or clicking on a phishing email, downloading bad software, or forgetting to report an employee termination to the IT dept – something which inadvertently created a vulnerability that could be exploited.  It’s tough to stop breaches because there are so many possible ways for them to happen.

If most security breaches start with a user mistake, then IT departments have their hands full, because users aren’t static, unchanging objects to monitor and manage.  Users change, sometimes a lot.  It is this constant change which undermines the ability of some IT departments to adequately secure company information systems and data.  Now is the time to take control of user security and identity management, creating automation and controls to protect business assets in a constantly evolving environment.

It is not simply employee turnover that challenges security management.  Certainly, IT departments have been dealing with user account creation and termination for a long time.  And sure, users have sometimes been promoted and demoted, resulting in the requirement for IT to increase or perhaps decrease access to information and applications.  These are normal and expected activities for a business IT department.  Unfortunately, IT often doesn’t hear about the user’s change in status.  An account isn’t disabled, access isn’t restricted, and the system is left vulnerable.
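
One common control for the terminations-IT-never-heard-about problem is to regularly reconcile IT accounts against an authoritative HR feed.  Here is a minimal sketch of that idea (the record layout and field names are hypothetical assumptions, not any particular directory’s API):

```python
def find_orphaned_accounts(it_accounts, hr_active_employees):
    """Return enabled accounts whose owner no longer appears in the
    HR system's active-employee list -- candidates for disabling."""
    active_ids = {e["employee_id"] for e in hr_active_employees}
    return [a for a in it_accounts
            if a["enabled"] and a["employee_id"] not in active_ids]

accounts = [
    {"user": "bob",   "employee_id": 101, "enabled": True},
    {"user": "carol", "employee_id": 102, "enabled": True},
]
hr = [{"employee_id": 101}]  # carol was terminated, but IT was never told
print([a["user"] for a in find_orphaned_accounts(accounts, hr)])  # ['carol']
```

Run on a schedule, a check like this catches the account that would otherwise sit enabled and vulnerable indefinitely.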

Just to pile on, think about what happens when a user is more than just a single system user.  Access may be manageable when a single identity and set of credentials governs a user’s access to applications and information.  But the proliferation of web-based services and SaaS solutions has made it commonplace for users to have multiple applications and services available to them, each with its own approach to identity management.

For even a small business IT department, the security of all of these access points and applications must be managed and monitored – no small task when the department may not even be aware that the solution is in use.  It is not unusual for file sharing, data sync, or other applications to be implemented in businesses without the knowledge or participation of the IT department.  Actually, many services attract users due to their simplicity and ease of use, leveraging the fact that they can be deployed without the “assistance” of IT.

Users are becoming increasingly mobile, accessing information and applications from public and private locations while using any number of possible mobile devices.  Vulnerabilities which may exist in public networks and the increased potential for device loss or theft are high on the list of concerns of IT departments managing remote and mobile user access.  Mobility is driving many changes in how information technology and access to systems is provided to users, and it is changing user demands for what they should be able to easily accomplish while being mobile.

Businesses need to recognize that their continued existence may rely on keeping their information systems and assets safe and secure.  Disaster recovery and business continuity apply not only to loss of physical systems, but also to losses of various forms due to data breach.  The disaster recovery and continuity plan (you have one, right?) should not only address situations after they happen; planning by definition is proactive.  It is not enough to have a plan to recover from loss or failure; the business must actively engage in activities which will prevent loss and reduce vulnerability.

Part of this plan necessarily centers on managing users and user identities – ensuring that the company knows about every access point and user account involved, and employs strict processes and guidelines to keep accounts up to date and limited to the authority they actually need.  In short, the plan must also be a plan for change, providing change management processes to guide the business as information technology and the dynamics of user interaction continue to evolve.

Make Sense?

J

Read more about IT Security and Engaging Users to Reduce Vulnerability

Read more about Mobility and the Cloud: Managing BYOD and securing company resources

The Line in the Sand: Your RPO (Recovery Point Objective)

Businesses and individuals are increasingly dependent upon the technology supporting their various activities, and the volume and velocity of information moving through these systems is increasing at astonishing rates.  With the growing reliance on information technology and electronic business data, you’d think that more businesses would be paying close attention to protecting these assets.  There is a broad understanding of responsibilities as they pertain to system security, and businesses of all sizes and types are increasing their awareness of the variety of threats facing their systems and are taking steps to address them.  Yet there remains an aspect of business data protection that too few businesses are really zeroing in on: the time and complexity of recovering or restoring business data in the event of an outage or loss – and the absolute line drawn in the sand which says that “here” is the tolerable loss we can experience, and no more.

This line in the sand is referred to as the RPO, or Recovery Point Objective. A recovery point objective is part of the business continuity plan (or should be!), and describes the maximum tolerable period of time for which data might be lost from a major IT service incident.  The necessity to establish this time frame – the RPO – exists whether the business is small or large.  In fact, small businesses have data protection needs quite similar to their enterprise counterparts.  In an article in SmallBusinessComputing.com, Kieran Maloney of Quantum Corporation is quoted as saying that “from a data protection standpoint, smaller businesses face challenges that are similar to those of larger enterprises; the amount, and the value, of their data is growing significantly while their budgets are not”.

What doesn’t seem to make sense is that businesses continue to view data backup as a necessary evil rather than a strategic element, and spending considerations for creating and meeting a realistic RPO remain low.  An article in TheStreet.com on the subject quotes Terry Cunningham, president and manager of EVault, saying “When largely preventable data loss conservatively costs businesses hundreds of millions of dollars annually, it is time to rethink your priorities”.  The author also writes that “while 95 percent of US IT decision makers said they have some type of disaster recovery plan in place, only 44 percent have remote, cloud-based recovery capabilities… More than twenty percent of IT organizations that manage between 2-7 TB of data suffered a data loss in the past year – in fact, more than half of this group suffered 2-3 data losses – each with an estimated average cost of 2-5 percent of total company revenues”.

Part of the continuity plan – and a consideration in developing an approach that will meet the RPO timeframe – should be the implementation of remote, cloud-based services, yet this has remained a low priority for many business owners.  Reliance upon more traditional data protection approaches, including tape backups and on-premises HDD solutions, gives IT managers a false sense of security and often cannot reasonably address recovery from data loss due to hardware outages, much less potentially catastrophic failures such as loss of the location.

When considering the RPO – the minimum acceptable point for data recovery (or maximum tolerable point for loss) – businesses must look at their data management and backup strategies in order to address recovery approaches for various types of outages.  There are benefits and drawbacks associated with the different methods of backing up data, and the cost/benefit of employing any solution must factor in to the requirement to meet the stated RPO.  Daily backups may be the standard procedure, but is a potential loss of 24 hours of data acceptable to the business?  On the other hand, what is the potential cost of re-creating the data, if it can even be recreated?  Consider also that the timeframe for data recovery is not the point at which the last backup was completed; it is the point when the last backup was started.  This could result in a loss window greater than the established 24-hour boundary.
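
The backup-window arithmetic above can be made concrete with a short sketch (the 24-hour interval and 2-hour backup duration are illustrative assumptions, not a recommendation):

```python
from datetime import timedelta

def worst_case_loss_window(backup_interval: timedelta,
                           backup_duration: timedelta) -> timedelta:
    """Worst-case data loss if a failure strikes just before the next
    backup begins: the full interval plus the time the last backup took
    to run, since recovery is only good to the point the backup STARTED."""
    return backup_interval + backup_duration

# Nightly backups that take 2 hours to complete:
window = worst_case_loss_window(timedelta(hours=24), timedelta(hours=2))
print(window)  # 1 day, 2:00:00 -- already past a 24-hour RPO
```

Even this toy calculation shows how a “daily backup” policy can quietly violate a stated 24-hour RPO.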

Many businesses would suggest that their tolerance for lost data – given the cost of lost productivity and order activity – is far less than 24 hours, yet the solutions employed to reduce potential data losses often do not address the issue in any comprehensive manner.  IT personnel working with separate products to handle incremental data backups, bare-metal machine recovery, and snapshots of disk arrays often have a tough time piecing the puzzle together, and often simply hope for the best in terms of outcome.

The prudent move is to thoroughly consider the business disaster recovery and continuity plan, and establish the boundaries for tolerable loss.  No business wants to expect to lose valuable data assets, but expecting technology to perform flawlessly is unrealistic, not to mention the unexpected impacts of acts of nature and other force majeure events.  Architecting systems to withstand service outages, and having a comprehensive plan for recovering from them in a timeframe survivable by the business, is the essential element in making a continuity plan worthwhile.  Draw the line in the sand, and then develop the system protection and recovery plan that will help make sure you never have to step over it.

Make Sense?

J

Here are a few data loss statistics for your reading pleasure… Enjoy  🙂

(stats drawn from summary on BostonComputing.net.  They may be a bit dated, but the numbers have only increased since then.) http://www.bostoncomputing.net/consultation/databackup/statistics/

The following statistics were gathered from various sources:

  • 6% of all PCs will suffer an episode of data loss in any given year. Given the number of PCs used in US businesses in 1998, that translates to approximately 4.6 million data loss episodes. At a conservative estimate, data loss cost US businesses $11.8 billion in 1998. (The Cost Of Lost Data, David M. Smith)
  • 30% of all businesses that have a major fire go out of business within a year. 70% fail within five years. (Home Office Computing Magazine)
  • 31% of PC users have lost all of their files due to events beyond their control.
  • 34% of companies fail to test their tape backups, and of those that do, 77% have found tape back-up failures.
  • 60% of companies that lose their data will shut down within 6 months of the disaster.
  • 93% of companies that lost their data center for 10 days or more due to a disaster filed for bankruptcy within one year of the disaster. 50% of businesses that found themselves without data management for this same time period filed for bankruptcy immediately. (National Archives & Records Administration in Washington)
  • American business lost more than $7.6 billion as a result of viruses during first six months of 1999. (Research by Computer Economics)
  • Companies that aren’t able to resume operations within ten days (of a disaster hit) are not likely to survive. (Strategic Research Institute)
  • Every week 140,000 hard drives crash in the United States. (Mozy Online Backup)
  • Simple drive recovery can cost upwards of $7,500, and success is not guaranteed.

 

IT Security and Engaging Users to Reduce Vulnerability

There is a lot of discussion going on about security in the cloud.  With numerous advancements in technologies intended to secure our information and identities on the Web, how is it that security continues to be a growing problem?  The answer lies in the Big Data the Web collects (read about the Internet of Things – IoT), the large silos of data now handily available in the cloud, and users who continue to provide access for all sorts of bad guys and malicious attackers simply because they do not understand that they – the users – remain the biggest vulnerability of all.  Educating these users, and finding a way to get them to recognize their potential as a critical element in enhancing system security and reducing vulnerability, has become the larger challenge.

“People are nothing more than another operating system,” says Lance Spitzner, training director for the Securing The Human Program at SANS Institute.  “Computers store, process and transfer information, and people store, process and transfer information.” – How Hackers Fool Your Employees

Social engineering – finding ways to earn user trust – has become a widely recognized means of gaining access to systems and information.  Any experienced computer security consultant recognizes that Microsoft Outlook is among the best applications to place in front of users to test system security, as emails with malicious attachments (spearphishing) represent a majority of targeted attacks.  And hackers aren’t resting on their laurels while users figure out that opening email from unfamiliar sources isn’t a good idea.  Nope, not for a minute.  Today’s flavor is “conversational” phishing, where it is made to appear as though a real person is at the other end of the conversation.  Hackers are patient, and they are willing to take the time to find a way in.  Users, on the other hand, still tend to be somewhat complacent when it comes to security, and often operate under the belief that the IT security products and the IT department have it all under control.  And no matter how many times they’re told not to click on strange email attachments, to change passwords frequently, not to reuse passwords, and to make passwords hard to guess… getting users to comply continues to challenge system administrators.
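
To illustrate why this kind of mail is so hard to police, here is a toy heuristic of the sort a filter might apply (purely hypothetical checks for illustration; real email security products use far richer signals than these):

```python
# File extensions commonly abused in malicious attachments (illustrative list)
SUSPICIOUS_EXTENSIONS = (".exe", ".scr", ".js", ".vbs", ".docm")

def looks_suspicious(sender: str, reply_to: str, attachments: list) -> bool:
    """Flag a message when the Reply-To domain differs from the sender's
    domain (a common spearphishing tell), or when an attachment carries
    a commonly abused file extension."""
    def domain(addr: str) -> str:
        return addr.rsplit("@", 1)[-1].lower()
    if reply_to and domain(reply_to) != domain(sender):
        return True
    return any(name.lower().endswith(SUSPICIOUS_EXTENSIONS)
               for name in attachments)

print(looks_suspicious("ceo@example.com", "ceo@examp1e.net", []))        # True
print(looks_suspicious("bob@example.com", "bob@example.com", ["q.pdf"])) # False
```

A “conversational” phishing message with a plausible sender and no attachment sails right past rules like these, which is exactly why user awareness still matters.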

Communicating with users about the importance of adhering to password management and other security standards often falls on deaf ears for two reasons: users believe that system security is the job of the IT department, and users are made to feel stupid when chastised and punished by the very IT department that’s supposed to be helping them.  Rather than educating users and finding innovative ways to get them to participate in improving system security, IT administrators and security teams generally view users as part of the problem rather than part of the solution.

It’s a heated debate that can upset people on opposing sides.  For instance, one RSA conference presenter conducted a class on “how to patch stupidity,” Spitzner says.  “He explained why people are stupid, how they’re stupid and how to fix stupid.  It was a very emotional talk for me, because how can you sit there and insult the very people who can end up helping us?” – How Hackers Fool Your Employees

In order to build security better suited to protecting businesses from today’s variety of threats, IT security professionals and system administrators should engage in positive internal marketing for better system security, deliver improved education to build user awareness, and actually engage users in the process of threat identification and detection.  These users don’t have to be geeks or IT people; they can be average users who simply keep their eyes open for things that just don’t seem right.  “People can become a detection system to improve organizational resilience.”

Make Sense?

J

Cloud Computing and Online Accounting for All? Some Markets Are Still Waiting for Broadband

As the information technology industry espouses the benefits of the “paradigm shift” in computing and the move to cloud computing platforms and models, there are folks out there in the world who just aren’t seeing it happen like that.  Not everybody’s working online.  For many, the Internet and online working models simply haven’t intruded into their lives and businesses as they have for others.  While this may be partially rooted in conservative mentalities and beliefs resistant to change, the more likely reality is that high-quality, affordable broadband service is simply not available to them.  Without choices for affordable and useful connectivity to the Internet, online just doesn’t have the attraction it does for those who are “connected”.

When businesses look at cloud solutions and the Internet dependency that comes along with them, having more than one connection to the outside world becomes the imperative rather than a luxury.  Unfortunately, some markets are still waiting for broadband (or have very limited options for service), rendering the cloud nearly unreachable.

It may come as a surprise to some, particularly to those in East and West coastal regions, that high speed broadband just isn’t as available in other zones.  In fact, the *National Broadband Map clearly reveals limited availability and choice in numerous regions of the US.  Broadband Internet access is a necessity to support the IT industry’s shift from localized IT to “cloud” IT.  But the shift is only evident to those who are involved in it or who have that option.  For those whom the industry is beginning to refer to as the technology “have-nots”, this lack of available and affordable access will ultimately create more than simply an inability to participate in broadband-reliant IT solutions.  The fast pace of innovation and evolution in IT almost guarantees that the technology have-nots will fall even further behind, possibly to the point of not being able to catch up.

 “A Growing Gap Between IT Haves, Have-Nots. There will be a growing gap between the IT haves and have-nots in 2013. The latter will fall behind the former on a wide range of business technology fronts such as mobile, cloud, social, virtualization, and analytics…” 7 SMB Technology Predictions for 2013 | InformationWeek.com

As business (and personal) technology models continue to evolve, and as new solutions and services begin to displace the old, those who remain disconnected will feel the impact much more directly.

Consider something as simple as using QuickBooks desktop software for small business bookkeeping.  As Intuit continues to remove elements from the installed software product, turning them into web services instead, customers with limited or no broadband access will find themselves without the features and functionality they need in the software.  And the only comparable alternatives to QuickBooks desktop accounting products are themselves Internet-based, making them not really alternatives at all.

It is also likely that lack of sufficient broadband is one of the factors motivating many solution providers to seek clients in other markets – outside of the United States, and in regions where broadband availability is more prevalent and service speed and quality is higher.  Yes, it’s true.  The United States is not the leader in broadband availability, or even in quality.

“For many people, their broadband connections are their lifelines. So what is the state of broadband in the U.S.? Well, when it comes to speed and price and adoption, we’re certainly not a leader — “middling” is a better way to describe our position.

Currently 119 million people that live in the U.S. don’t have broadband connections (for many reasons, including not wanting it or not being able to afford it) while 19 million don’t even have the option to get it. Our rate of broadband adoption (62 percent) lags behind countries such as South Korea, the U.K., and Germany, according to this year’s Federal Communication Commission report. (We’re closer to the penetration rates of Japan, Finland, and Canada.) These numbers are not likely to change soon, given that broadband growth is slowing and providers are moving away from wireline infrastructure.” – GIGAOM: The state of broadband in the U.S. [infographic]

Accountants and other professional service providers serving clients in regions lacking sufficient choices for access must recognize that their approaches to doing business will not necessarily match those of their peers in more fully connected areas.  Certainly, accounting and legal professionals are dealing with this reality as practice coaches and industry leaders push for IT- and cloud-enabled models for improving practice performance and creating differentiation, even as their proven applications and business solutions morph into or are replaced with SaaS applications and online services.

The takeaway is that there are still large numbers of businesses and individuals doing things with legacy tools, managing spreadsheets on standalone PCs, or writing with pens and using paper – even in areas where broadband access is plentiful.  Regardless of how forward-moving the rest of the world may be, there remains a need to provide service and support to these IT have-nots.  Perhaps this becomes a means of differentiation: finding ways to work with businesses that are connected and those that are not, and leveraging the firm’s access and capability to deliver what the client cannot obtain directly.

Make Sense?

J

*The National Broadband Map is a tool to search, analyze and map broadband availability across the United States.

Moving Your Systems to the Cloud

The IT industry is promoting Software as a Service and online applications as the new normal for computing, and unless you’ve been living under a rock for the past few years you have heard how it is supposed to make our computing lives ever so much better.  Hiding under that rock might also have spared you from reading about the various failures and outages which impact users, forcing them to make do without the online applications and data they have become so reliant upon.  It’s surprising, but not unimaginable, that businesses rely so heavily on applications and services that didn’t even exist a few short years ago.

The potential benefits of a SaaS model are many, but the risks are equally significant and should not be minimized.  A business considering the move should begin with an assessment centered on a review of the application software in use, considering whether or not it is meeting the needs of the business.  Where and how the software runs is much less of an issue than the functionality and process support it provides – most “legacy” applications can be run in a cloud server environment, making remote access and managed service part of the service model.

There is risk in changing business applications – risk of data loss, changed or broken data relationships, lost productivity, and more.  Many businesses would benefit by running their applications in a cloud model while continuing to utilize the software solutions their operation relies on.

Application hosting models, where desktop applications are delivered on cloud servers, are often overlooked when businesses go looking for cloud software, because buyers are shopping for software and not the platform.

With Software as a Service (SaaS), the software and the platform are combined and together represent the solution.  With application hosting on a cloud server, the software is the same software a business would traditionally run on PCs and servers, but it is installed and managed on the cloud server rather than on local computers.

The big benefit is the agility of the platform and the user mobility it allows.  The unspoken benefit is that you can still “take your ball and go home” if the service doesn’t work out.

Removing the barriers for adopting an online working model allows the business to experience the benefits attached to cloud computing without introducing unnecessary risk through unneeded changes in software and applications.

Make Sense?

J