Cloud Computing Evolved: Disruptive Technology Goes Mainstream


A 2010 report by IDC (International Data Corporation, a global provider of market intelligence) offered a few interesting predictions for information technology in these changing times.  Not surprisingly, many of the predictions centered on the same "ingredients of IT industry transformation" identified in past years as disruptive technologies: cloud computing, mobile devices and applications, wireless broadband, virtualized infrastructure, social networking, and smart devices, among others.  The subsequent 2011 report suggested a continuing trend of spending and innovation in cloud technologies and mobile computing, and in collecting and analyzing the huge volumes of data being generated.  It is clear that cloud computing is evolving from disruptive technology to mainstream IT.

Everyone must by now recognize the significant growth in the use of online and mobile applications and services.  If you haven't noticed that just about everyone has a smartphone or tablet computer, then you've got your head buried deep in the sand.  What this clearly indicates, and IDC supported the position with quantifiable evidence, is that the "disruptive technologies" of yesterday have transitioned from early adoption to mainstream adoption.  This means that use of these technologies had pushed "well beyond" the first 10 to 15% of the market through 2010, and that customers were ready to integrate these new solutions as core parts of their overall IT strategy.

If you don't believe that cloud computing, virtualization, and mobile access are becoming (have become?) mainstream, consider the staggering number and variety of mobile devices and networks available today.  The adoption of these devices is driven by the availability of broadband wireless service, and their use is fueled by applications offering "social business" and "pervasive analytics".  No longer just a voice communications device, the mobile phone has become a mobile workstation, capable of supporting a wide variety of business and personal interactions and functions intended to help users generate and analyze "unprecedented volumes of information" – and the 2011 report indicates that mobile computing is continuing to fuel the trend.

“Mobility wins” will be the top theme of the year as mobile devices outship PCs by more than 2 to 1 and generate more revenue than PCs for the first time. 85 billion mobile apps will be downloaded, and mobile data network spending will exceed fixed data network spending for the first time.

IDC's 2010 report placed an interesting focus on the impact of this new era of IT, believing it would be a launchpad for the creation of "intelligent industry" with an IT-enabled "intelligent economy".  This doesn't apply only to very large multinational corporations, like the ones in the IBM commercials about a smarter planet or the commercials where the box tells us where it (and the delivery truck) is.  This new-found intelligence would allow businesses of all sizes to offer better and more customized services locally while dramatically expanding their market reach beyond geographic boundaries, positioning them for accelerated growth.

As the number of intelligent communicating devices on the network will outnumber "traditional computing" devices by almost 2 to 1, the way people think about interacting with each other, and with devices on the network, will change. Look for the use of social networking to follow not just people but smart things.

Business owners who find a way to leverage this new capability through innovative applications of cloud computing and mobile device access will almost certainly find that their businesses are not only better suited to addressing the needs of their current market, but also poised to take advantage of opportunities in emerging markets.

In 2010 IDC predicted that by 2012 we would begin to see the “slow death” of cloud computing – the term, not the technology model.  Even though cloud computing is one of the hottest buzzwords in tech today, the model is becoming mainstream to the point where it is no longer considered a bleeding-edge method of computing requiring its own descriptive name. While IDC may have been a bit off in terms of forecasting the slow death of “cloud” terminology, their finding that the evolution of cloud computing models is rapidly progressing from disruptive to mainstream appears to be spot on.

Make sense?
J

updated from original post in 2010

Many Companies Are Negligent About SAP Security, Researchers Say – CIO.com

Is your hosting service provider helping to keep your critical business applications secure?  It is not enough to simply harden machine images and develop policy-driven access; application hosting providers need to understand the vulnerabilities introduced by each and every application in the environment.  Otherwise, the system could be exposed to threats directed specifically at the application environment and the opportunities it presents.

Many hosting providers will offer customers service for any business application they have, and often provide those services with no significant experience or expertise in dealing with configuration or security issues specific to those applications or environments.  Consider the following report from CIO.com, which indicates that numerous SAP deployments remain vulnerable to attack or intrusion even though SAP has improved the security of its products. The problem rests not exclusively with the SAP applications, but also with the approach to implementing systems and security around those applications.  Understanding the various vulnerabilities introduced with SAP products is the first step to securing them.  Certainly a skilled IT solution provider is likely to offer a high level of service and capability, but certain products (like SAP) introduce additional or unique considerations, and it is important for the service provider to be aware of them and address them.
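
As a concrete illustration of the exposure problem described in the report below, here is a minimal sketch (not from the article) of how a provider or customer might check, from an outside network, whether commonly cited default SAP service ports on a host answer at all.  The host name and the port list are illustrative assumptions; anything found open from the Internet should be justified or closed.

```python
# Minimal sketch: test whether commonly cited default SAP ports on a host
# are reachable from outside the network. Run it from an external vantage
# point; any port that answers here is one the provider should justify.
# The host name and port list are illustrative, not taken from the article.
import socket

HOST = "erp.example.com"  # hypothetical externally visible host
PORTS = {
    3200: "dispatcher (DIAG)",
    3300: "gateway (RFC)",
    3600: "message server",
    8000: "ICM HTTP",
    50013: "management console (sapstartsrv)",
}

for port, label in PORTS.items():
    try:
        with socket.create_connection((HOST, port), timeout=3):
            print(f"OPEN     {port:>5}  {label}")
    except OSError:
        print(f"no reply {port:>5}  {label}")
```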

Joanie Mann Bunny FeetMake Sense?

J

IDG News Service — SAP has significantly improved the security of its products over the past few years but many of its customers are negligent with their deployments, which exposes them to potential attacks that could cripple their businesses, according to security researchers.

The biggest issue is that companies expose insecure SAP services to the Internet — not only HTTP services, but also critical administrative interfaces, Alexander Polyakov, chief technology officer at ERPScan, a developer of security monitoring products for SAP systems, said Tuesday.

Between 5 percent and 10 percent of companies that use SAP products expose critical services to the Internet that shouldn’t be publicly accessible, Polyakov said. This happens because they want to enable remote management or because of improper configurations, he said.

Most of the services have vulnerabilities that can be easily attacked, Polyakov said.

Publicly available exploits exist for many SAP vulnerabilities, including some that are part of Metasploit, a popular security testing tool.

The percentage of companies with exposed SAP services differs from country to country. The situation is better in North America and Europe and worse in the Asia-Pacific region, Africa and Latin America, Polyakov said. However, even 5 percent translates to a very large number of companies, he said.

via Many Companies Are Negligent About SAP Security, Researchers Say – CIO.com.

Preparing for Disasters of the Legal Kind


As businesses begin to realize the benefits of cloud computing and business data mobility, they may be overlooking one of the most important issues any enterprise can face: information management in the event of litigation.  While the IT department probably has a disaster recovery plan for handling various computer system failures, is there also a plan for managing system data and electronic information in the event of a "legal disaster"?  In the spotlight is e-discovery, the requirement that a business respond to legal requests for electronically stored information, and the issues CIOs and business owners should be paying attention to as computing solutions and technology models continue to change at a rapid pace.

The popularity of BYOD (Bring Your Own Device), data sync solutions, and online collaboration tools has created an environment where business data may exist in various states (meaning conditions or status, not a State like California) and on a variety of devices and systems, some of which may not be under the direct control of internal IT.  Regardless of where or how the information was delivered to these devices and systems, CIOs and business owners should recognize that the information on those devices is included in discovery requests, and should have a plan for handling the response.

This "e-discovery plan" is the critical piece, and it means not only working through the various aspects of managing the information, but also giving consideration to keeping the plan updated.  As technology changes, and as user behavior changes along with it, businesses must adjust their IT management approaches in kind.  Consider that a user couldn't store business data on their phone until the phone was able to handle that function.  Now that smartphones are the norm and tablet computers are gaining in popularity, business data is roaming on personal and business devices.  These advancements may introduce productivity and process gains which provide an advantage to businesses, but they also introduce potential risk and added complexity when it comes to e-discovery.

Litigation is always expensive, but sanctions for slow response or other costs can be avoided if the plan helps the business respond in a timely manner.  For this reason, the plan should include an identification of all sources for information (every location where business information and data is stored), as well as the steps to be taken to preserve this data in the current state.  If the business has systems which regularly purge information (like accounting systems which purge prior period details, email systems which automatically purge old emails, or backup systems which delete old backup files as new ones are made), all of these activities must be halted.  If the company doesn’t have access to control the various devices and systems to prevent these activities (or doesn’t know that they are happening), significant risk is introduced.  In the case of a legal “hold”, all data and metadata and the audit controls and files must be preserved.
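
To make the "identify every source and halt every purge" step more concrete, here is an illustrative sketch, with entirely hypothetical source names, of a simple inventory that records each data source and the automatic purge behavior that must be suspended when a legal hold is declared.  It only prints reminders; the actual purge jobs still have to be paused in each system.

```python
# Illustrative sketch only: a simple inventory of data sources and the
# automatic purge behaviors attached to them, with a litigation-hold flag
# that suspends every purge. Names and sources are hypothetical.
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    location: str          # where the data lives
    auto_purge: str        # the scheduled deletion behavior to suspend
    on_hold: bool = False  # set True when a legal hold is declared

SOURCES = [
    DataSource("email", "hosted Exchange", "deletes mail older than 24 months"),
    DataSource("accounting", "cloud-hosted QuickBooks", "purges prior-period detail"),
    DataSource("backups", "provider backup service", "rotates out old backup sets"),
    DataSource("mobile sync", "employee phones and tablets", "overwrites local caches"),
]

def declare_hold(sources):
    """Flag every known source; each flagged purge job must actually be paused."""
    for src in sources:
        src.on_hold = True
        print(f"HOLD: {src.name} at {src.location} -- suspend: {src.auto_purge}")

declare_hold(SOURCES)
```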

The final steps in the plan cover what happens after the litigation is over.  An often forgotten part of the plan is the final destruction of the information gathered for discovery.  It is not that the original data must be destroyed (consider ALL dependencies), but the "database" of collected information related to the litigation probably should be.  With this data pooled in a single place, it becomes a potentially valuable target for a data breach.  At minimum, the collected information could too easily be pulled into an entirely new legal case.

IT managers, CIOs and business owners must be realistic about the information their enterprises generate and store, including being realistic about the risk potential that duplicated and mobile data represents.  It is not that the enterprise should be afraid of allowing mobility and providing remote access solutions, but it is essential that the enterprise control the use of these solutions and how they use or interact with business data.   Without a strictly enforced policy of usage and control for all devices, services and solutions “touching” business data, any legal disaster planning falls short.

Make sense?

J

Read More:

e-Discovery in the Cloud: Benefits versus Risks

e-Discovery in the Cloud: Benefits versus Risks

After many years of working with business professionals in “enabling” their organizations to make better use of technology, I must say that it is a bit frustrating trying to get folks to understand that this new and wonderful cloud computing model (or Internet-based computing, SaaS, or whatever-you-want-to-call-it computing) is still just technology.  It uses computers and disk drives, it runs software, it takes electricity, and it was developed by human beings.  It can break.   It’s not magical and perfect and you can’t get the good stuff for free.  Swim at your own risk.  So, assess the risks, and measure the benefits against the risks and costs.  For many, the benefits outweigh the risks, as cloud computing approaches can deliver advanced capabilities at cost levels not previously available to most businesses.

No industry is immune to the security and access considerations surrounding a cloud computing model.  In particular, lawyers involved in e-discovery (all of them) have recognized the potential benefits – and tradeoffs – of the model.  This reality was clearly revealed at the ILTA (International Legal Technology Association) 2010 event in Las Vegas.  While the discussions at the conference were oriented specifically towards the legal profession, the IT-related discussions are totally relevant to every business.  Accounting and finance professionals should pay close attention to this type of conversation, as it relates very directly to accounting's approach to information technology and the application of IT in the business or professional practice.

In a recap of the event entitled ILTA 2010 in Las Vegas: Strategic Unity, Defensibility, and the Cloud, author Chris Dale observed that professionals in both public and corporate service must work with their IT departments towards a common goal.  "IT is no longer just a service department providing an infrastructure, applications, training, and troubleshooting."  While these elements remain critical aspects of IT, the role has grown to also incorporate considerations for collaboration (collaborative information management), mobility, and social media.

Recounting one session attended, called Defensible Ediscovery Processes, the author related the variety of definitions offered for the general term "defensible", which were pretty amusing.  These definitions ranged from "protected against attack", to "less lousy practices or practices which suck the least" (my personal favorite), and finally, "what you can get away with without being found guilty of spoliation".  From these definitions then came qualifiers, such as "reasonableness" and "faith".

Why would defensible processes be important, and how does this relate to IT or cloud computing?  An example of the element of "faith" came up in this context: "how can [lawyers] have faith that the technology is delivering the right answers?"  A panelist gave the example of "an email retrieved from (or possibly not retrieved from [love those lawyers]) a system, with 26.5 pages missing.  How can you be sure that the systems which you are using will not do that to you?"  These are valid questions in any IT environment, and they are no less important when considering a cloud-based technology model.  The trade-offs relate to perfection in the functionality and performance of the solution versus cost, and should be measured in proportion to one another.

The tradeoffs may come in a variety of areas, with collaboration and connectivity being the primary drivers (collaboration) and barriers (connectivity) to the model.  Businesses are more than ready to adopt cloud computing strategies based on the belief in improved collaboration, access to information, and improved IT management, but tend to overlook the offsets in the areas of bandwidth availability (and consistency), application functionality (or lack thereof), and level of support available from the provider.  In support of this argument, Jerry Justice (IT Director for SS&G – Certified Public Accountants and Advisers) posted in a LinkedIn discussion on the topic that "by design the Internet is 'reasonably' connected, but not the same as a well-connected [local] network.  The upside is it gives you the ability to connect from great distances; the tradeoff is that you experience variable connectivity."

"The underlying issues are that there is a paradigm shift to working on the Internet (from working in the office) and then another shift when you add in cloud-based environments (versus local apps).  It is possible to be very productive, but ... you have to adapt your approaches."

The idea "that perfect must be qualified by cost and proportionality" was also discussed in an ILTA session on cloud computing which included panelists from Autonomy iManage, Mayer Brown, and Ernst & Young.  "Cloud computing remains a contentious area, with no obvious agreement even as to what the term means, let alone as to its implications," wrote Mr Dale in his recap of the event.  While the panelists held differing views, the representative from Mayer Brown held a position similar to Mr Dale's, in that it is important to "dissect the objections one at a time, accepting that there is room for more than one view, and testing arguments against the alternatives.  Arguments based on pure cost are pretty compelling, and if one method of achieving an objective is very much cheaper than the others, then the burden shifts to those who argue for the more expensive route."

Discussions went on to describe differences between public cloud providers and others, who segregate customer data in “private and identifiable silos”.  “The key word here is identifiable“, writes the author, “which connotes a geographical certainty as well as anything else.  I sometimes wonder if the imagery associated with cloud computing (invariably a jagged line disappearing into some cumulus) does not leave some people with the idea that their precious data is indeed floating in some inchoate container up in the air.”

"If you neglect to provide in your contract that your data remains in a specified jurisdiction, and if you fail to conduct proper due diligence checks on the provider, then you deserve all you get.  Like any risk assessment, it involves weighing cost against other factors; most of these other factors are definable and quantifiable."

I couldn’t have said it better myself.

J

original post March 24, 2011

Moving Your Systems to the Cloud

The IT industry is promoting Software as a Service and online applications as the new normal for computing, and unless you’ve been living under a rock for the past few years you have heard how it is supposed to make our computing lives ever so much better.  Hiding under that rock might also have spared you from reading about the various failures and outages which impact users, forcing them to make do without the online applications and data they have become so reliant upon.  It’s surprising, but not unimaginable, that businesses rely so heavily on applications and services that didn’t even exist a few short years ago.

The potential benefits of a SaaS model are many, but the risks are equally significant and should not be minimized, which is why any move to the cloud should start with an assessment.  That assessment should center on a review of the application software in use, considering whether or not it is meeting the needs of the business.  Where and how the software runs is much less of an issue than the functionality and process support it provides – most "legacy" applications can be run in a cloud server environment, making remote access and managed service part of the service model.

There is risk in changing business applications – risk of data loss, changed or broken data relationships, lost productivity, and more.  Many businesses would benefit by running their applications in a cloud model while continuing to utilize the software solutions their operation relies on.

Application hosting models, where desktop applications are delivered on cloud servers, are often overlooked when businesses go looking for cloud software, because buyers are shopping for the software and not the platform.

With Software as a Service (SaaS), the software and the platform are combined and together represent the solution. With application hosting on a cloud server, the software is the same software a business would traditionally run on PCs and servers, but it is installed and managed on the cloud server rather than on local computers.

The big benefit is the agility of the platform and the user mobility it allows.  The unspoken benefit is that you can still “take your ball and go home” if the service doesn’t work out.

Removing the barriers for adopting an online working model allows the business to experience the benefits attached to cloud computing without introducing unnecessary risk through unneeded changes in software and applications.

Make sense?

J

 

Hosting All My Applications in the Cloud

Many business owners will recall when their first in-house computer networks were installed.  When the PCs were networked together in an office, it made file sharing and collaboration among team members easier and more efficient.  Installing additional applications on the PC was a relatively simple process, and when the new application came with the ability to integrate with another app already on the PC, it was often a fairly simple process to get the two “talking” together.  But installing and integrating applications on your personal computer is a bit different from getting multiple applications installed and integrated with a cloud hosting service provider.

In almost all cases, integrating multiple desktop software solutions requires installing those solutions on the same computer so that they can share certain program elements or, at least, share .ini or data files.  Application integration is important because it allows different software solutions to work together, communicating data from one application to the other so the information may be used in different ways or for different purposes.

An example of this might be a Microsoft Office integration with QuickBooks, which allows the user to perform a one-click export of QB financial data to an Excel spreadsheet.  Another example is the integration between QuickBooks and Fishbowl Inventory, which synchronizes information from the Fishbowl inventory system into the QuickBooks financial software.

In nearly every case where a software program has a software-based integration with another solution, the integration must be installed in the same system as the core solution.  In the QuickBooks world, this means that the programs which integrate with QuickBooks must be installed on the same computer as QuickBooks.

In a conventional PC network, the necessity of installing the various software solutions on the same machine is not a big problem, as PC software and integrations have been implemented in this manner for years.  On the other hand, when the business is considering the option of moving desktop applications to the cloud, it is important to make sure the provider and service will allow all of your products to be hosted.  In most cases, this requirement highlights the main difference between a shared service and a dedicated or server-based solution.

With shared services, the servers are generally configured to offer a strict and limited set of applications to be hosted.  The applications on the servers are used by subscribers of the service, and users are limited to accessing only those applications available in the environment.  The shared approach is popular with some application hosting providers as it creates an economy of scale which helps providers earn more revenue on their infrastructure.  The trade-off is that a shared hosting solution only works well for businesses with limited application requirements, and it generally becomes fairly expensive as more users are added to the service.

The need for diversity in hosted application choices, coupled with the need for businesses to keep costs down even as the number of business users increases, is the primary driver for adoption of dedicated and server-based cloud hosting solutions.  When the solution is managed as an entire environment rather than on an exclusively per-user basis, an economy of scale is developed within the organizational IT infrastructure.  As the business grows and adds more users and applications, the incremental cost to bring each user or application onto the platform is often far less than a user subscription in a shared solution.
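
A rough back-of-the-envelope comparison makes that economics easier to see.  The numbers below are made-up assumptions, not real pricing, but they show how a dedicated environment's fixed cost can undercut per-user subscription pricing once the user count grows.

```python
# Back-of-the-envelope comparison, using made-up numbers, of per-user shared
# hosting versus a dedicated hosted server whose fixed cost is spread across
# the whole team. All prices are illustrative assumptions, not quotes.
shared_per_user = 60.0        # hypothetical monthly fee per user, shared service
dedicated_base = 500.0        # hypothetical monthly fee for a dedicated server
dedicated_per_user = 10.0     # hypothetical incremental cost per added user

for users in (5, 10, 25, 50):
    shared = shared_per_user * users
    dedicated = dedicated_base + dedicated_per_user * users
    cheaper = "dedicated" if dedicated < shared else "shared"
    print(f"{users:>3} users: shared ${shared:>7.2f}  dedicated ${dedicated:>7.2f}  -> {cheaper}")
```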

For any business planning to migrate their servers and systems to the cloud, the first step is to gain a thorough understanding of the applications and integrations the business needs in the host environment, and then to find a hosting provider that can deliver the infrastructure and baseline system administration required.  It is unreasonable to expect a hosting provider to be an expert in every software product available, but skilled and experienced hosting providers understand how to install and implement most standard business applications and will rise to meet the customer demand.
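
For that inventory step, something as simple as listing the programs registered on each office PC can be a starting point.  The sketch below is offered only as an illustration: it reads the standard Windows Uninstall registry key using Python's winreg module, so it assumes a Windows machine and ignores 32-bit entries recorded under the WOW6432Node key.

```python
# Sketch of a starting point for the application inventory: list programs
# registered in the standard Windows Uninstall registry key on one PC.
# Windows-only (uses the winreg standard-library module); 32-bit entries
# under the WOW6432Node key are not covered here.
import winreg

UNINSTALL = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def installed_programs():
    names = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL) as root:
        count = winreg.QueryInfoKey(root)[0]      # number of subkeys
        for i in range(count):
            with winreg.OpenKey(root, winreg.EnumKey(root, i)) as sub:
                try:
                    name, _ = winreg.QueryValueEx(sub, "DisplayName")
                    names.append(name)
                except FileNotFoundError:
                    pass                          # entry without a display name
    return sorted(set(names))

for program in installed_programs():
    print(program)
```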

While no business can guess what its future software needs may be, decisions can reasonably be made based on the solutions currently in use.  Finding a provider with a service to meet immediate needs is useful, but businesses change and therefore business requirements change, and it is good to know that the hosting infrastructure and IT services supporting the business can adjust to those changing needs.  After all, cloud hosting of applications and data just means the servers and infrastructure are with the service provider and not in the office; it doesn't mean businesses can't have the feature-rich and functional applications they have come to rely on.

Make sense?

J