Avoid the Aftertaste: QuickBooks Desktop Hosting Comes in Many Flavors


There is a lot of activity and interest around the hosting of desktop applications in the cloud, and it is no wonder that a great deal of the effort centers on the use of Intuit QuickBooks desktop editions.  QuickBooks is among the most popular software products used by small businesses, so it makes sense that service providers and hosting companies are taking advantage of that market share to reach prospective hosting customers.  After all, a hosting platform may be kind of neat, but it is not all that valuable unless there are applications and data living on it.

For the average small business, the applications of choice include Microsoft Office and QuickBooks.  Yes, there is an online edition of the QuickBooks product (called QuickBooks Online, of course).  However, the market share Intuit earned for QuickBooks wasn’t accomplished with an online application, it was done with the desktop applications which still own market share today.  Hosting service providers recognize this truth, and are taking steps to bring those QuickBooks desktop solutions into the cloud.  Now we have the ability to get QuickBooks Desktop editions online – which is not the name of a service but a description of what it offers – available from a variety of authorized hosting providers (and from many unauthorized ones).

I’ve said before that there is a fine art to hosting QuickBooks desktop for lots of users.  There are a great many different considerations and possible use cases, and not all providers will be able to meet every requirement.  There are also lots of different technology models and methodologies which may be applied to the hosting model, and each has some benefit or barrier depending on the specific need of the client.  Hosting companies may throw around terms like “cloud server” or “published application” or “remote desktop”, but at the end of the day, the systems are still Windows computers running QuickBooks software.  How those systems are wrapped up, how you connect to them, and how you operate with them often becomes the real difference in the service experience.

The specific technology a hosting provider applies to the service does not necessarily describe exactly how the service works.  Just because a provider uses Citrix doesn’t mean it has more capability to provide quality service than a provider using other technologies, nor is a host using VMware necessarily creating better cloud servers than a host using Hyper-V or Parallels or some other virtualization strategy.  The technology may impact how the infrastructure is operated and can influence the customer experience, but the real differences in delivery often come down to the provider’s understanding of the software product, the customer need, and their ability to meet that need directly.

Does the experience of connecting to and using the service work for the users, and are people able to get their jobs done quickly using the service without a lot of support or frustration?  (Please note that hosting services aren’t a solution for bad software and poor working processes.  If the software or processes aren’t workable now, they’re not going to become magically more workable when hosted.)  Does the hosting service address issues like making the right data available only to the users who need it, and giving access to applications only when a user is permitted to use them?  What about “external” users like contractors or client businesses… does the host offer a way for them to also participate in the solution?

It’s important to consider all of the aspects of how the service will be used, and by whom and under what circumstances, to ensure that the delivery offered is the solution needed. The point of all this is to encourage users to concern themselves a little less with exactly what technology the host is using to deliver QuickBooks applications, and to evaluate the actual solution.  It won’t typically matter to an end-user what specific technology is being used to provide them with service as long as the service works well for them.

While some people do adopt a fondness for a particular “flavor” of technology or approach, the reality is that a quality user experience coupled with a useful and reliable system means much more to the business.  And knowing that there are future options for growing, expanding or simply changing the service is essential.  It’s not so much the flavor of technology users should be concerned with when shopping for QuickBooks hosting services, it’s avoiding that icky aftertaste that comes with selecting a QuickBooks hosting approach that just doesn’t meet the business need.

Make sense?

J

Cloud Computing Evolved: Disruptive Technology Goes Mainstream


A 2010 report by IDC (International Data Corporation, a global provider of market intelligence) offered a few interesting predictions for information technology in these changing times.  Not surprisingly, many of the predictions centered on the same “ingredients of IT industry transformation” identified in past years as disruptive technologies: cloud computing, mobile devices and applications, wireless broadband, virtualized infrastructure, social networking, and smart devices, among others.  The subsequent 2011 report suggested a continuing trend of spending and innovation in cloud technologies and mobile computing, and in collecting and analyzing the huge volumes of data being generated.  It is clear that cloud computing is evolving from disruptive technology to mainstream IT.

Everyone must by now recognize the significant growth in use of online and mobile applications and services.  If you haven’t noticed that just about everyone has a smart phone or tablet computer, then you’ve got your head buried deep in the sand.  What this clearly indicates, and IDC supported the position with quantifiable evidence, is that the “disruptive technologies” of yesterday have transitioned from early adoption to mainstream adoption.   This means that use of these technologies had pushed “well beyond” the first 10 to 15% of the market through 2010, and that customers were ready to integrate these new solutions as core parts of their overall IT strategy.

If you don’t believe that cloud computing, virtualization, and mobile access are becoming (have become?) mainstream, consider the staggering number and variety of mobile devices and networks available today.  The adoption of these devices is driven by the availability of broadband wireless service, and their use is fueled by applications offering “social business” and “pervasive analytics”.  No longer limited as a voice communications device, the mobile phone has now become the mobile workstation, capable of supporting a wide variety of business and personal interactions and functions intended to help users generate and analyze “unprecedented volumes of information” – and the 2011 report indicates that mobile computing is continuing to fuel the trend.

“Mobility wins” will be the top theme of the year as mobile devices outship PCs by more than 2 to 1 and generate more revenue than PCs for the first time. 85 billion mobile apps will be downloaded, and mobile data network spending will exceed fixed data network spending for the first time.

IDC’s 2010 report placed an interesting focus on the impact of this new era of IT, believing that it would be a launchpad for  the creation of “intelligent industry” with an IT-enabled “intelligent economy”.  This doesn’t apply only to those very large multinational corporations, like the IBM commercials about a smarter planet and the commercials where the box tells us where it (and the delivery truck) is.  This new-found intelligence would allow businesses of all sizes to offer better and more customized services locally while dramatically expanding their market reach beyond geographic boundaries, and positioning themselves for accelerated growth.

“As the number of intelligent communicating devices on the network will outnumber ‘traditional computing’ devices by almost 2 to 1, the way people think about interacting with each other, and with devices on the network, will change. Look for the use of social networking to follow not just people but smart things.”

Business owners who find a way to leverage this new capability through innovative applications of cloud computing and mobile device access will almost certainly find that their businesses are not only better suited to addressing the needs of their current market, but also poised to take advantage of opportunities in emerging markets.

In 2010 IDC predicted that by 2012 we would begin to see the “slow death” of cloud computing – the term, not the technology model.  Even though cloud computing is one of the hottest buzzwords in tech today, the model is becoming mainstream to the point where it is no longer considered a bleeding-edge method of computing requiring its own descriptive name. While IDC may have been a bit off in terms of forecasting the slow death of “cloud” terminology, their finding that the evolution of cloud computing models is rapidly progressing from disruptive to mainstream appears to be spot on.

Make sense?
J

updated from original post in 2010

Preparing for Disasters of the Legal Kind


As businesses begin to realize the benefits of cloud computing and business data mobility, they may be overlooking one of the most important issues any enterprise can face: information management in the event of litigation.  While the IT department probably has a disaster recovery plan for handling various computer system failures, is there also a plan for managing system data and electronic information in the event of a “legal disaster”?  In the spotlight is e-discovery, which is the requirement of the business to respond to legal requests for electronically stored information, and the issues CIOs and business owners should be paying attention to as computing solutions and technology models continue to change at a rapid pace.

The popularity of BYOD (Bring Your Own Device), data sync solutions, and online collaboration tools has created an environment where business data may exist in various states (meaning as in conditions or status, not as in State, like California) and on a variety of devices and systems, some of which may not be in the direct control of internal IT.  Regardless of where or how the information was delivered to these devices and systems, CIOs and business owners should recognize that the information on those devices is included in discovery requests, and should be prepared with a plan for dealing with the response.

Creating this “e-discovery plan” is the most important step, and it means not only working through the various aspects of managing the information, but also giving consideration to keeping the plan updated.  As technology changes, and as user behavior changes along with it, businesses must adjust their IT management approaches in kind.  Consider that a user couldn’t store business data on their phone until the phone was able to handle that function.  Now that smartphones are the norm and tablet computers are gaining in popularity, business data is roaming on personal and business devices.  These advancements may introduce productivity and process gains which provide an advantage to businesses, but they also introduce potential risk and certain complexity when it comes to e-discovery.

Litigation is always expensive, but sanctions for slow response or other costs can be avoided if the plan helps the business respond in a timely manner.  For this reason, the plan should include an identification of all sources for information (every location where business information and data is stored), as well as the steps to be taken to preserve this data in the current state.  If the business has systems which regularly purge information (like accounting systems which purge prior period details, email systems which automatically purge old emails, or backup systems which delete old backup files as new ones are made), all of these activities must be halted.  If the company doesn’t have access to control the various devices and systems to prevent these activities (or doesn’t know that they are happening), significant risk is introduced.  In the case of a legal “hold”, all data and metadata and the audit controls and files must be preserved.
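The “halt all purge activities” step can be pictured as a simple gate that every automated cleanup job checks before it runs.  Here is a minimal Python sketch of that idea; the class names, source names, and messages are invented purely for illustration, not a reference to any real e-discovery product:

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    """One location where business data lives (server, mailbox, backup set)."""
    name: str
    auto_purge_enabled: bool = True
    on_hold: bool = False

@dataclass
class LegalHold:
    """Tracks which sources have been frozen for a discovery request."""
    matter: str
    sources: list = field(default_factory=list)

    def apply(self, source: DataSource) -> None:
        # Preserving data "in its current state" means every automatic
        # deletion routine must be suspended before it runs again.
        source.auto_purge_enabled = False
        source.on_hold = True
        self.sources.append(source.name)

def run_purge(source: DataSource) -> str:
    # Any scheduled cleanup job should check the hold flag first.
    if not source.auto_purge_enabled:
        return f"{source.name}: purge skipped (legal hold)"
    return f"{source.name}: old records purged"

mail = DataSource("email-archive")
hold = LegalHold(matter="Case 2012-001")
hold.apply(mail)
print(run_purge(mail))  # email-archive: purge skipped (legal hold)
```

The point of the sketch is the ordering: the hold must be applied to every identified source before the next scheduled purge fires, which is why the plan needs a complete inventory of data locations up front.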

The final steps in the plan are the steps to be taken after the litigation is over.  This often-forgotten part of the plan covers the final destruction of the information gathered for discovery.  Not that the original data must be destroyed (consider ALL dependencies), but the “database” of collected information related to the litigation probably should be.  With this data pooled in a single place, it becomes a potentially valuable target for a data breach.  At minimum, the collected information could too easily be pulled into an entirely new legal case.

IT managers, CIOs and business owners must be realistic about the information their enterprises generate and store, including being realistic about the risk potential that duplicated and mobile data represents.  It is not that the enterprise should be afraid of allowing mobility and providing remote access solutions, but it is essential that the enterprise control the use of these solutions and how they use or interact with business data.   Without a strictly enforced policy of usage and control for all devices, services and solutions “touching” business data, any legal disaster planning falls short.

Make sense?

J


e-Discovery in the Cloud: Benefits versus Risks

After many years of working with business professionals in “enabling” their organizations to make better use of technology, I must say that it is a bit frustrating trying to get folks to understand that this new and wonderful cloud computing model (or Internet-based computing, SaaS, or whatever-you-want-to-call-it computing) is still just technology.  It uses computers and disk drives, it runs software, it takes electricity, and it was developed by human beings.  It can break.   It’s not magical and perfect and you can’t get the good stuff for free.  Swim at your own risk.  So, assess the risks, and measure the benefits against the risks and costs.  For many, the benefits outweigh the risks, as cloud computing approaches can deliver advanced capabilities at cost levels not previously available to most businesses.

No industry is immune to the security and access considerations surrounding a cloud computing model.  Lawyers involved in e-discovery in particular (all of them, really) have recognized the potential benefits – and tradeoffs – of the model.  This reality was clearly revealed at the ILTA (International Legal Technology Association) 2010 event in Las Vegas.  While the discussions at the conference were oriented specifically towards the legal profession, the IT-related discussions are relevant to every business.  Accounting and finance professionals should pay close attention to this type of conversation, as it relates very directly to accounting’s approach to information technology and the application of IT in the business or professional practice.

In a recap of the event entitled ILTA 2010 in Las Vegas: Strategic Unity, Defensibility, and the Cloud, author Chris Dale discussed that professionals in both public and corporate service must work with the IT departments towards a common goal.  “IT is no longer just a service department providing an infrastructure, applications, training, and troubleshooting.”  While these elements still remain as critical aspects of IT, the role has grown to also incorporate considerations for collaboration (collaborative information management), mobility, and social media.

Recounting one session attended, called Defensible Ediscovery Processes, the author related the variety of definitions provided for the general term “defensible”, which were pretty amusing.  These definitions ranged from “protected against attack” to “less lousy practices, or practices which suck the least” (my personal favorite), and finally, “what you can get away with without being found guilty of spoliation”.  From these definitions then came qualifiers, such as “reasonableness” and “faith”.

Why would defensible processes be important, and how does this relate to IT or cloud computing?  An example of the element of “faith” came up in this context: “how can [lawyers] have faith that the technology is delivering the right answers?”  A panelist gave the example of “an email retrieved from (or possibly not retrieved from [love those lawyers]) a system, with 26.5 pages missing.  How can you be sure that the systems which you are using will not do that to you?”  These are valid questions in any IT environment, and are no less important when considering a cloud-based technology model.  The trade-offs relate to perfection in functionality and performance of the solution versus cost, and should be measured in proportion to one another.

The tradeoffs may come in a variety of areas, with collaboration and connectivity being the primary drivers (collaboration) and barriers (connectivity) to the model.  Businesses are more than ready to adopt cloud computing strategies based on the belief in improved collaboration, access to information, and improved IT management, but tend to overlook the offsets in the areas of bandwidth availability (and consistency), application functionality (or lack thereof), and level of support available from the provider.  In support of this argument, Jerry Justice (IT Director for SS&G – Certified Public Accountants and Advisers) posted in a LinkedIn discussion on the topic that “by design the Internet is ‘reasonably’ connected, but not the same as a well-connected [local] network.  The upside is it gives you the ability to connect from great distances; the tradeoff is that you experience variable connectivity.”

“The underlying issues are that there is a paradigm shift to working on the Internet (from working in the office), and then another shift when you add in cloud-based environments (versus local apps).  It is possible to be very productive, but you have to adapt your approaches.”

The idea “that perfect must be qualified by cost and proportionality” was also discussed in an ILTA session on cloud computing which included panelists from Autonomy iManage, Mayer Brown, and Ernst & Young.  “Cloud computing remains a contentious area, with no obvious agreement even as to what the term means, let alone as to its implications” wrote Mr Dale in his recap of the event.  While the panelists held differing views, the representative from Mayer Brown held a position similar to Mr Dale, in that it is important to “dissect the objections one at a time, accepting that there is room for more than one view, and testing arguments against the alternatives.  Arguments based on pure cost are pretty compelling, and if one method of achieving an objective is very much cheaper than the others, then the burden shifts to those who argue for the more expensive route.”

Discussions went on to describe differences between public cloud providers and others, who segregate customer data in “private and identifiable silos”.  “The key word here is identifiable”, writes the author, “which connotes a geographical certainty as well as anything else.  I sometimes wonder if the imagery associated with cloud computing (invariably a jagged line disappearing into some cumulus) does not leave some people with the idea that their precious data is indeed floating in some inchoate container up in the air.”

“If you neglect to provide in your contract that your data remains in a specified jurisdiction, and if you fail to conduct proper due diligence checks on the provider, then you deserve all you get.  Like any risk assessment, it involves weighing cost against other factors; most of these other factors are definable and quantifiable.”

I couldn’t have said it better myself.

J

original post March 24, 2011

The race to find the “secret sauce” for hosted application services for small business


Cloud computing is here to stay.  What was once viewed as bleeding-edge technology fraught with peril and risk is now recognized as an emerging standard for application deployment and delivery.  The race to the clouds represents a significant challenge, however, when issues of application interoperability and integration are introduced – particularly when it comes to small business solutions which traditionally reside on the local PC and network.

Today’s available technologies and platforms quite nicely facilitate single application deliveries, yet frequently fail to address the dynamic provisioning and deployment requirements of a rich integrated application environment.  Users who desire to select from a variety of applications in a hosted environment are most often met with barriers which won’t allow them to have the particular mixture of solutions they need.

While virtualization approaches for platforms and applications are gaining popularity and increasing in capability, the reality of the problem still rests with the fact that, in order for applications to integrate, they must be installed on the same machine.  There is a race amongst the virtualization platform providers to find the “secret sauce” of application hosting: enabling a flexible, dynamic, rich application delivery method which overcomes the need to have integrated applications installed together on servers in pre-selected ‘packages’.  With the secret sauce, the provider would be able to offer the customer any possible combination of available applications, and offer them as fully integrated solutions, regardless of whether or not those applications were actually installed on a machine together somewhere.
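The constraint that integrated applications must live on the same machine means a provider can only satisfy a request when some pre-built server image already contains every requested application.  A minimal Python sketch of that matching problem (the image names and installed-application lists below are invented for illustration):

```python
# Integrated apps must be co-installed, so a provider can only satisfy
# a request if one pre-built server image contains every requested app.
server_images = {
    "img-accounting": {"QuickBooks", "Office", "PDF Printer"},
    "img-basic":      {"Office"},
}

def find_image(requested_apps):
    """Return the first image hosting all requested apps together, or None."""
    needed = set(requested_apps)
    for name, installed in server_images.items():
        if needed <= installed:   # every requested app is installed together
            return name
    return None                   # no pre-selected package fits this mixture

print(find_image(["QuickBooks", "Office"]))  # img-accounting
print(find_image(["QuickBooks", "CRM"]))     # None
```

The `None` case is exactly the barrier described above: a customer wanting a mixture the provider never pre-selected is out of luck, which is why a delivery method that escapes fixed packages would be such a breakthrough.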

Currently, the solution is addressed (sort of) in how the provider deals with three main elements in the service model: packaging, provisioning, and business rules.  With these three ingredients appropriately approached in a flexible infrastructure and partner network, the potential for broad hosted application delivery and distribution exists.  Service providers are still stuck with the requirement to pre-select their various partner or integrated application inclusions, but it is possible to offer the perception of maximum flexibility without actually having it.  The challenge lies not with the business rules, but in the packaging and provisioning processes.

Packaging is the step where the item to be provisioned is combined with other elements, resulting in a service or installation “package”.  Much like a manufacturing assembly process, packaging takes into consideration the total resource utilization or requirement, accounting for all resources combined into or used to create the package.  Packaging cannot be performed without first understanding, at a detailed level, what can be provisioned and how.  With the variety of applications, data services, implementation methodologies and models which exist, a single-method approach has proven not to address the majority of software products currently available in the SMB market, and is unlikely to in the near future.
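In code, the packaging step might be pictured as combining components into one deliverable while totalling their resource requirements, much like a bill of materials in manufacturing.  A rough Python sketch (the component names and resource figures are illustrative only):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AppComponent:
    """An installable item together with its resource footprint."""
    name: str
    ram_mb: int
    disk_mb: int

def build_package(components):
    """Combine components into a service package, accounting for
    the total resources the combined installation will require."""
    return {
        "components": [c.name for c in components],
        "ram_mb": sum(c.ram_mb for c in components),
        "disk_mb": sum(c.disk_mb for c in components),
    }

pkg = build_package([
    AppComponent("QuickBooks Desktop", ram_mb=2048, disk_mb=4096),
    AppComponent("Microsoft Office",   ram_mb=1024, disk_mb=3072),
])
print(pkg["ram_mb"])  # 3072
```

The hard part in practice is not the arithmetic but the inventory behind it: knowing, at a detailed level, what each component needs and how it can be provisioned, which is the understanding the paragraph above says must come first.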

An example of this challenge is partially revealed when we look at the diversity of applications involved in the Microsoft Office ecosystem.  Many businesses rely upon the functionality present in the Office suite desktop products, such as Word, Excel and PowerPoint.  For some application users, this functionality is not present in their primary-use software, but is provided via desktop-level integration methods.  In order to deliver the full functionality and capability of the primary-use application, the installation and integration support for the Office applications must also be provisioned and packaged into the service.  Offering even limited integrations and options like Office for inclusion in the package can introduce challenges in data access and management, permissions and file-level security, and ISV licensing of applications.  For these and other reasons, attempting to provide a rich, user-selected mixture of deliverables poses the ultimate challenge to the application service provider.

When a cloud or application hosting provider can offer their customers the ability to sign up for, purchase, and utilize without complication or delay their selections of desktop or web-based applications, services and integrations, and pay for them as a subscription service accessible at any time and from anywhere – that’s the secret sauce of application hosting and cloud IT that everyone’s looking for.

Make sense?

J