AI FOMO and Your Business

“AI FOMO” (Fear of Missing Out) has become a major force behind business adoption of artificial intelligence.

Rather than pursuing AI with a clear strategy, too many organizations are investing because of competitive pressure, media buzz, and fear of falling behind. This reactive approach often leads to rushed, expensive, and poorly executed initiatives that fail to create real value—and can even spark internal friction.

Surveys show that a large share of IT leaders and executives—sometimes more than 60%—acknowledge that FOMO significantly influences their AI adoption decisions. This fear is fueled by rapid technological change, assumptions that competitors are gaining an advantage, and limited understanding of what AI can and cannot actually do.

Implementing AI without thoughtful planning or alignment to business needs often results in wasted investments in tools that don’t address real problems. Projects may stall in the early stages or fail to produce any measurable benefit or return on the investment.

One of the biggest challenges with AI centers on data and trust.

When a business puts speed of development above quality and security, the result can be data errors, AI “hallucinations,” and just plain wrong answers that diminish trust in AI systems. Workers may already feel threatened or undervalued, which creates anxiety and slows technology adoption, so take care not to introduce AI prematurely in ways that further erode trust in the technology.

I’ve always understood that technology isn’t just a tool: it can be a strategic advantage, helping businesses gain ground in ways not previously available. The key is to move away from fear-based adoption and toward a deliberate, value-driven approach.

Start by identifying the real business problem. Figure out what problems you need the technology to solve for you rather than asking what AI can do. Just because AI can do something doesn’t mean you want it to do it for you, or that it will deliver any real value to your process or operation.

Change for the sake of change makes no sense, so it is essential to understand whether there is actually a problem AI may be able to solve, and whether the benefits of the solution outweigh the cost to develop it and the risk it potentially introduces. Start small, with pilot projects in low-risk but high-impact areas of the business where the organization can learn and refine before scaling.

Among the most important aspects of AI in business is the data the AI works with. This is where many businesses fail in their initial attempts at AI development, largely because their data is siloed or segregated and left unclassified and uncategorized.

For AI development to deliver effective business benefit, high-quality, organized data and solid data infrastructure are essential.

AI systems learn directly from the data they are given: machine learning and generative AI in particular identify patterns and make predictions based on training data, so a model is only as good as that data. If the data is incomplete, inaccurate, inconsistent, or poorly managed, the AI’s performance will reflect those flaws.

Poor-quality data results in biased, unreliable, or incorrect outputs. High-quality data supports accurate, trustworthy, and consistent results. If an AI is trained on inaccurate or inconsistent information, it will learn (and repeat) those errors.
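
The garbage-in, garbage-out dynamic is easy to demonstrate. Here is a toy sketch, with hypothetical labels and the simplest possible “model,” showing how a system faithfully reproduces whatever its training data says, right or wrong:

```python
# Toy illustration of "AI is only as good as its data": a trivial
# majority-vote "model" trained once on clean labels and once on a
# corrupted copy of the same cases. (Labels are hypothetical.)
from collections import Counter

def train_majority(labels):
    """Learn the single most common label; the simplest possible model."""
    return Counter(labels).most_common(1)[0][0]

clean_labels = ["approve"] * 9 + ["deny"]          # accurate history
corrupted_labels = ["approve"] * 4 + ["deny"] * 6  # same cases, mislabeled

model_from_clean = train_majority(clean_labels)          # "approve"
model_from_corrupted = train_majority(corrupted_labels)  # "deny"
```

The model doesn’t know the corrupted labels are wrong; it simply learns and repeats them, which is exactly what happens at scale with real AI systems.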

Shift from a fear of missing out to a fear of missing the advantages of AI.

The focus should be on maximizing AI’s potential to create a competitive advantage, taking strategic risks that are aligned with the business goals. Replace fear-driven decision-making with thoughtful, goal-oriented planning and turn AI into a meaningful source of long-term value and differentiation rather than an anxiety-inducing trend to chase.

Noobeh cloud services works on the Microsoft Azure platform, creating data platforms and delivering services that fuel and support AI development. Let us create the dynamic data infrastructure your business needs to develop the intelligence to propel you forward.

Make Sense?

J

Write it Once – The Value of Integration

It’s amazing how much time and energy continues to be spent on duplicate data entry and re-keying information generated by one system into another. Human-based data entry is prone to error, takes time, and carries the burden of employee costs and resources. It is a problem that businesses of all types have battled for years, even though enabling solutions have been around for a while.

Methods of integrating applications and data have existed for quite some time, and in recent years these methods have expanded to include a wide variety of platforms and more open standards-based approaches.  Even in the small business world, business owners using traditionally limited software products can enjoy sophisticated extensions and integration of their applications and business data.

To provide a simple example of the problem: when an individual writes a check, that check must be recorded for several purposes, including recording the cost or expense as well as the reduction of funds in the bank account. When a product is sold to a customer, inventory is relieved, sales are increased, accounts receivable or cash is increased, cost of goods sold is recorded, and customer activity is captured. All of this information must be recorded and accounted for throughout the financial and operational systems, which can represent a tremendous burden if not automated. It also means that data exists in a variety of places, increasing the challenges of information collection and reporting.
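
That fan-out can be sketched in a few lines. This is a minimal illustration with assumed account names, not a complete accounting model:

```python
# One business event (a product sale) fans out into several postings
# across the books, and all of them must stay in sync. Account names
# here are assumptions for illustration.
def postings_for_sale(sale_price, cost_of_goods):
    return [
        ("Accounts Receivable", "debit",  sale_price),     # customer owes us
        ("Sales Revenue",       "credit", sale_price),     # revenue recognized
        ("Cost of Goods Sold",  "debit",  cost_of_goods),  # cost recognized
        ("Inventory",           "credit", cost_of_goods),  # stock relieved
    ]

def is_balanced(postings):
    """Double-entry rule: total debits must equal total credits."""
    debits = sum(amount for _, side, amount in postings if side == "debit")
    credits = sum(amount for _, side, amount in postings if side == "credit")
    return debits == credits
```

A single sale touches four accounts; multiply that by every transaction in the business and the case for automation makes itself.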

Cloud-based integration and infrastructure services such as DBSync and Microsoft Azure enable seamless collection, transformation, aggregation, and storage of business data. Whether linking accounting with a sales CRM or pushing financial and operational data to an Azure data warehouse for analytics, Noobeh and DBSync provide the data engine and the infrastructure to put it all together.

A small business owner’s situation offers a direct illustration. He sells computer parts through an ecommerce website. Orders from the website were emailed to his order operators, who then re-keyed them into the accounting system, where inventory was also tracked. Because of the growing number of sales orders and product purchase orders to enter, three operators worked in the department responsible for making sure website orders made it into the accounting system. Orders were frequently missed or misplaced, entry errors caused problems in accounting and product delivery, customer satisfaction went down, and the cost of handling web orders kept increasing.

By implementing a single software solution, the company was able not just to address the current problem but to set itself up to seamlessly increase business without increasing headcount. The solution was a system that takes transaction data from the ecommerce system and imports it into the accounting/ERP system. This single step allowed the business to reduce and redirect personnel costs, improve the accuracy and timeliness of data entry, and increase customer satisfaction as well as overall business performance.
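
At its core, that kind of system is an idempotent import loop: read each order once, post it once, and never double-post on a re-run. The CSV format, field names, and posting function below are assumptions for illustration, not details of any particular product:

```python
# Sketch of a "write it once" import: pull orders exported by an
# ecommerce site (CSV assumed) and hand each one to the accounting
# system exactly once, keyed by order ID so re-runs never double-post.
import csv
import io

def import_orders(csv_text, post_to_accounting, already_imported):
    """Post each new order and return the IDs imported this run."""
    imported = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        order_id = row["order_id"]
        if order_id in already_imported:
            continue  # idempotent: skip orders posted on a previous run
        post_to_accounting(row)       # hypothetical hand-off to accounting/ERP
        already_imported.add(order_id)
        imported.append(order_id)
    return imported
```

The de-duplication check is the piece human re-keying gets wrong most often; a machine never forgets which orders it has already entered.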

In even a small company, one piece of information may be used in a variety of ways and in a variety of systems. This complexity is found in simple business models as well as larger and more complex enterprises, revealing the value of integration solutions and automation tools at every level of operation.

Mendelson Consulting and Noobeh cloud services recognize that every business needs the right information at the right time to operate effectively. Our expert teams help businesses implement the solutions which bring business data together, empowering workers to be more productive and giving stakeholders the decision-support tools they need.

Make Sense?

J

Timing is Everything: Security, AI and the Tech Stack in Restaurants

Running a restaurant or chain of restaurants is no easy task. Margins are often razor thin while customer demands continue to expand. Food service, like so many other industries, is struggling to bear the weight of change as labor shortages, rising costs, increasing cybersecurity risk, and demands for an improved customer experience push the industry toward more efficient and effective operations. To keep up, businesses must learn more about what’s really going on in the operation and turn that insight into action. Timing is everything, and now is the right time to look at technology and platforms that deliver greater insight and intelligence.

Whether leveraging commercial solutions available from partners or building the tech stack themselves, businesses in the restaurant industry are looking for innovative solutions that drive more profitability and increase revenues using resources they’re already paying for. AI is playing a big role in this evolution, and businesses of all sizes, from the single-location entrepreneur to the multi-location franchise, are taking steps to incorporate it into the operation. Yet AI is all but useless if it doesn’t have the data to analyze.

One example of a transformational solution for the restaurant industry is Curbit’s product line, which includes digital infrastructure and real-time AI capabilities that analyze data around service, real-time order progress, kitchen performance, and guest sentiment. Microsoft’s Azure and Azure AI platform is key to Curbit’s innovation and development, enabling Curbit to deliver the information that powers timely decision-making rather than offering only after-the-fact reports or outdated dashboard data.

In the category of maximizing the resources you’re already paying for, look at new services available through DoorDash, where lunch special and happy hour promotional offerings help businesses drive demand during off-peak times. Designed to increase revenues and drive greater sustainability, the service also reflects how restaurants increasingly need to leverage online tools and mobile ordering to offset reduced foot traffic at brick-and-mortar locations.

On the other side of that coin are information security and privacy: how businesses deal with the realities of cybersecurity threats and the need for greater levels of protection. Collecting more data for analysis means an increased risk of exposing private data if it is not adequately protected.

Considering high-profile incidents like what happened at Panera, adequate cybersecurity protections must be part of the essential infrastructure that supports the operation. What was initially described as a systemwide technical outage at Panera was ultimately found to be a cybersecurity breach that exposed some employee personal data and became the basis for a class action lawsuit filed by Panera employees.

Whether it belongs to employees, customers, or others, personal and private data must be protected. The cost of protecting data is likely lower than the cost of dealing with a breach and the potential resulting backlash, which is another part of the equation that must be considered.

Mendelson Consulting and Noobeh cloud services recognize how businesses need to modernize their systems, developing greater intelligence and resilience in the operation. We also recognize the importance of redundancy and agility in systems, and how quality managed cybersecurity solutions and services help guard against attack.

From ERP and specialized business solutions to platform, hosting and managed service, Mendelson Consulting and Noobeh cloud services can help your business meet the demands of doing business now and in the future.

Make Sense?

J

Are You Prepared for SQL Server 2008 End of Support?

Everything gets old eventually, and now it is official for SQL Server 2008.

On July 9, 2019, support for SQL Server 2008 and 2008 R2 will end. That means the end of regular security updates and general support for the product. Are you ready?

It took more than 10 years for Microsoft to end support for our beloved SQL Server 2005, and version 2008 has enjoyed a similarly long reign. But it’s over, and you need to get used to the idea. Even more, you need to upgrade to a newer version of SQL Server so your systems can still be patched, updated, and supported. With all the nasty exploits out there, letting your software get out of date is more of a business risk than ever.

With cyberattacks becoming more sophisticated and frequent, running apps and data on unsupported versions can create significant security and compliance risks. The 2008 family of products was great for its time, but we highly recommend upgrading to the most current versions for better performance, efficiency, and regular security updates.
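
A quick way to reason about those lifecycle dates is a small helper that flags unsupported versions. The 2008/2008 R2 date below is the July 9, 2019 deadline discussed in this post; the other dates follow the Microsoft Lifecycle Policy, but treat the mapping as an illustrative sketch rather than an authoritative reference:

```python
# Sketch: flag SQL Server versions whose extended support has ended.
# Dates reflect the Microsoft Lifecycle Policy (2008/2008 R2 ended
# support on July 9, 2019, as discussed above); verify against the
# current policy before relying on them.
from datetime import date

EXTENDED_SUPPORT_END = {
    "2005": date(2016, 4, 12),
    "2008": date(2019, 7, 9),
    "2008 R2": date(2019, 7, 9),
    "2012": date(2022, 7, 12),
    "2014": date(2024, 7, 9),
}

def is_supported(version, today=None):
    """True only if the version is known and its end date hasn't passed."""
    today = today or date.today()
    end = EXTENDED_SUPPORT_END.get(version)
    return end is not None and today < end
```

Run against an inventory of database servers, a check like this makes the scope of an upgrade project visible at a glance.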

Now is a Good Time to Consider Azure

Microsoft is giving a present to businesses that want to migrate their workloads to Azure. For customers that take this opportunity to move to the Azure cloud, extended security updates for the 2008 and 2008 R2 versions of SQL Server and Windows Server will be available for free in Azure, helping secure workloads for three more years after the end-of-support deadline. Moving existing systems to the Azure cloud is a natural step in modernizing the business infrastructure, and it makes the next step, upgrading to managed database services and/or migrating to new Azure servers, a lot easier.

Upgrading isn’t simply a matter of maintaining status quo, either.

Moving to new versions can be a foundation for new strategic capability and increasing overall business potential, powering new decision-making processes fueled by analytics and business intelligence.

The Microsoft Lifecycle Policy offers 10 years of support (5 years of regular support and another 5 years of extended support) for the 2008 and 2008 R2 versions of SQL Server and Windows Server. When the extended support period ends, there will be no patches or security updates, which always creates security risk.

If your business is going to remain competitive, you can’t rely on outdated systems.

Your business is tough enough to manage without having your systems work against you.  Software that prevents you from keeping up with demand, creates risk in compliance and security, and reduces operational performance is not what you need. Collecting, storing and rationalizing data takes power and speed, and securing your growing information warehouse requires vigilance in security and update management.

Use this opportunity to review your platforms and applications, and consider moving your on-premises or co-located systems to the cloud. The upcoming milestone is a great opportunity to transform applications and infrastructure to take advantage of cloud computing and the latest versions of SQL Server and Windows Server.

Make Sense?

J

QuickBooks Point of Sale and Hosting

QuickBooks Point of Sale in a Hosted Environment

Retail operators and multi-location store owners often face difficulties in attempting to bring cohesion to their accounting, financial, and operational data. In so many situations, the retail location, where inventory is sold and money is exchanged, is far removed from the administrative location where the financial systems and business reporting exist. The best-case scenario is to give the remote (retail) locations real-time access to centralized customer, inventory, and financial data from a primary source. Application hosting services can provide this centralization of systems, along with a platform for their standardization. Further, the application hosting model can deliver security and managed services that ensure the systems are available and performing as required.

Even though hosted applications and centralization of the systems and processes in a POS environment may appear to be the right answer, there are caveats and considerations that speak to the realities of today’s technologies.  These caveats should be strongly considered prior to undertaking any reformation of systems and processes relating to the retail locations.

The first fundamental reality which must be addressed is connectivity.

While a retail or store location may enjoy Internet or network connectivity, think hard about the wisdom of connecting these locations exclusively via remote access systems.

Retail is a dynamic business, and the sale is made when the customer is ready and willing to buy.  Any retail location must be able to process this sale in order to meet the immediacy of customer demand.

If the systems in use are accessed exclusively remotely, then connectivity to those systems becomes paramount to the ability to do business. At the very minimum, any remotely served retail location should have redundant connectivity options, with local personnel familiar with the connection failover process.

A second strong consideration for a hosted or remotely-deployed POS or retail system is local device support.

Devices such as card readers, scanners, cash drawers, and receipt printers typically require local PC drivers in order to function. When served by a remote system, the connection between the host and the local devices may not work. Limited device support for POS hardware can significantly impact a location’s accuracy and efficiency.

QuickBooks POS was designed for a single-user PC environment. The application is not well suited to a hosted deployment for multiple users, as the software allows only one instance of itself to run on each computer. This alone eliminates the benefits of a server-based computing model for POS, whether onsite or hosted. The multi-lane option requires all stores to be connected to the same LAN, so remotely connecting multiple locations isn’t really doable, either. This is why there is a multi-store option, allowing the various stores to operate independently and send daily data back to a master location via a store transfer or email process.

In many cases, the suitable answer is to keep the POS systems running on the local computers and network, and run the accounting applications on the host. The host system, whether it be an on-premises server or a location in the cloud, could also run the software which integrates the POS data with accounting.

For example, with an installation of QuickBooks accounting and the point-of-sale “master location” on the host, the core financial data can be secured and protected in the virtual environment without risking lost productivity (and lost sales!) due to connectivity failures at the retail locations. The end-of-day process at each location is then to copy the POS data to the host system, where it is integrated with the accounting system. If the POS system is something other than QuickBooks POS, it simply means another piece of software, the specific POS integration tool, is required to transfer the POS data into the accounting software. QuickBooks desktop accounting integrations are available for most popular POS systems, including Micros, POSiTouch, Aloha, and others. The integration software (often just a QuickBooks plug-in) would be installed on the computer running QuickBooks, enabling the entry of the POS data into the QuickBooks accounting system.

It makes a ton of sense to centrally manage the accounting and financial data for the business, in a secure location away from the retail storefront and frontline workers.  It’s just that the accounting is easier to host and makes more sense to run as a centrally-managed, hosted solution.  POS, on the other hand?  Not so much.

For a small market vendor or the largest of retail stores, point of sale needs to be up and running at all times, driving receipt printers and cash registers/drawers and barcode scanners. Run the POS system on-premises where the action happens, but keep accounting and finance safe and secure somewhere else.

Make Sense?

J

Easy deployment in the cloud: What about users and applications?

Businesses are migrating their systems to the cloud, it’s true. Organizations of every size and type are taking advantage of the cost savings and flexibility introduced with cloud deployments and hosting services. Rather than focusing efforts on procuring, installing and maintaining servers and applications in-house, IT departments are moving workloads offsite to cloud providers and hosted platforms. The tools are readily available to help these IT workers configure and light up VMs in hosted infrastructure, and certain platform licenses and other elements are made accessible to customers. But there’s something missing in the toolsets provided by platform hosting companies – a certain something that ultimately determines how useful (or not) the hosting platform service is when IT is ready to deploy users and applications in the environment.

Conceptually, hosting services are supposed to provide a centralized management and administrative capability for an organization.  While this is true in the context that most of the system and resources are assembled inside the datacenter, proximity alone doesn’t make things easier to manage.  In fact, some virtualization and delivery models can exacerbate issues that IT at least had a known way of dealing with when it was in-house.

Consider that, even in hosted and virtualized infrastructure, everything that needed to be done to build the in-house network still has to be done – only now it involves the on-premises computers (sometimes with client software still requiring installation and management), the local LAN, the Internet, the datacenter facility and network, and computers and software in the datacenter.  Most of the complexity may reside in the datacenter with the hosted systems, but even that scenario isn’t necessarily plug-n-play.  IT must still bring up the servers, and then the fun begins.  Fun, in this case, means setting up policies and permissions, users, and applications.  The unfortunate thing is that there are few tools being made available which directly and specifically address this requirement for customers in hosted infrastructure.  Hosted customers are still burdened with the requirement to not only establish and manage their permissions and user accounts – they also have to still install, update and maintain application software in the environment.

Most IT teams recognize that installing an application once is way better than having to install it a bunch of times, so there is a tendency to lean towards hosting models where a single (or few) machines service desktop and application sessions for lots of users.  Reducing the number of actual application installations, this approach (such as with terminal services) can make software implementations go a bit easier than if the app had to be installed across a lot of machines.  On the other hand, there is a fine art to implementing some applications in terminal server environments, and not all apps behave well in the delivery model.  Many engineering hours have been spent trying to get user apps working on terminal servers – sometimes much more time than if the application were simply installed to multiple PCs.  On an ongoing basis, technicians fight with applications and broken functionality, wishing the entire time that they could bypass the terminal services issue and get back to working with individual machines and app installs.  At least they knew the apps would work.

Companies determining that a VDI or DaaS solution would more directly mirror the individualized PC approach quickly find that managing and maintaining the working user environment, including the variety of applications and functionality demanded by entry-level and power users alike, is just as complicated and time-consuming as it was when they were managing individual user PCs. And, lacking quality software distribution and lifecycle management tools in the platform, they find that template-based VM imaging doesn’t go far enough in easing the burdens of installing, updating, and maintaining applications on a user machine, whether it’s a local PC or a managed VM.

The truth about many cloud solution offerings and hosting platforms is that they are often oriented toward the enterprise customer and IT department, expecting that the customer has the skills and capability required to do the right things in deploying the hosted solution for the company. Leaving all of the time-consuming aspects of service management and delivery to the customer, the parts of the delivery that address the actual users, desktops, and applications, simply shifts the location of IT’s work, not necessarily its nature. They’re not just going to spend a bunch of time setting up groups, users, and applications; they’re going to spend a bunch of time managing and maintaining them, just like they always have.

There should be smart solutions to these problems: tools made available to customers who want to deploy their operations in hosted infrastructure, delivering the automation and ease of management that lets IT realize gains through process efficiencies at all levels of the deployment. The heavy lifting isn’t buried in the building of a server. The heavy lifting, the grunt detail work nobody really wants to deal with, exists around groups, users, and applications. Get some truly useful automation tools in those areas, and hosting becomes even more viable and beneficial for value-added resellers, IT departments, and their users.

Make Sense?

J