Chances are, in an average day, you are not accomplishing as many tasks as you would like … and neither are your colleagues or your employees. What is mystifying about that statement is that today’s workforce seems to be putting in more hours and more effort than ever before, coinciding with increased adoption of IT devices and applications designed to improve user productivity. In fact, this has been a key driver for organizations to enable workforce mobility – to provide flexibility in accessing business IT resources (applications, data, email, and other services) from any device, at any location, at any time in order to improve overall business performance. But even the most accomplished business professionals must admit there are days when little gets done despite herculean efforts.
It’s time to take a serious look at Office 365. The cloud edition of Microsoft’s broadly adopted business productivity suite – which bundles such popular packages as Microsoft Word, Microsoft Excel, Microsoft PowerPoint, and Microsoft Outlook – has been both heavily praised and heavily criticized since its introduction in 2011. While the adoption rate of the traditional software edition of Microsoft Office is currently in no danger of being overtaken by its cloud-hosted cousin, recent adoption rates for Office 365 have substantially accelerated. Businesses, in particular, have shown increased interest in the cloud-based platform, and many are carefully considering whether to make the transition after existing Enterprise Agreement (EA) licenses expire.
In my last blog, I talked about “IT Cultural Transformation and the Elimination of Technology Silos.” That blog focused on four areas of advice, which also provide a useful foundation for the topic of “Governance and Optimization.” These key areas include: Standing in the middle of the storm – This means looking at the interdependencies [...]
Business Process Management in the Real World — Why It’s Important to Govern Both Automated and Manual Processes
In a perfect world, all business processes would be automated and all work tasks would be accomplished with the click of a button. This idyllic work experience seems to be the realization of Plato’s utopia … or, if you prefer, the world of the Jetsons. Regrettably, however, we clearly do not live in a perfect world. Put simply, while any repeatable process can be automated, not every process is repeatable, so automation is not a practical solution in all cases. This is a particular problem for enterprises, since business productivity is almost entirely dependent on the rapid and accurate performance of business processes.
Though cyber attacks have been around for years, 2014 saw an explosion in the volume of attacks and a marked increase in the losses and damages they inflicted. In 2015, this does not seem to be letting up.
IT operations managers are cringing all around the world – desperately trying to avoid those inevitable words from their executive management: “You need to support enterprise mobility.” Their concerns are understandable. After all, IT administrators are already overtaxed with supporting desktop, server, application, and infrastructure management requirements. Asking them to layer a whole new management discipline on top of that can be a daunting prospect. IT managers who find themselves in this predicament often recognize it as an opportunity to practice the fine art of procrastination. Particularly skilled procrastinators will employ one or more of the following excuses:
IT Cultural Transformation and The Elimination of Technology Silos—An Exercise in Efficiency or a Dream Turned Nightmare?
Cultural transformation and eliminating IT silos may sound like an impossible dream—and indeed, perhaps “eliminating” is too strong a word. But the reality is that IT organizations must change toward a more responsive, business-aligned culture, as well as toward a more service-aware (versus siloed) way of working. So how do you begin? A lot depends, [...]
A few weeks ago, I was briefed by a new company called PFP Cybersecurity, also known as Power Fingerprinting, Inc., and was so intrigued by the concept alone that I wrote a Vendor to Watch about them. They officially launched on January 26, and currently their claim to fame is their physics-based scanning technology, which monitors the electromagnetic frequency (EMF) emanations of a microchip while it operates. It then compares those readings to either a previous reading or to an established manufacturer’s baseline to determine the state of the chip. There are numerous uses for the technology, from supply chain chip counterfeit detection to operational failure prediction and, most unique of all, malware detection. Because the scanners are touchless, they are useful in many environments, especially those that are change and failure/fault intolerant, such as space vehicles, nuclear and other critical infrastructure environments, and multiple military and natural resource acquisition environments. There is nothing to install on the system using the microchip, so no change control requests or outage windows are needed. The other interesting thing about their technology is that it is disruptive to the current scanner market, costing significantly less than competing products. Their premise is that each model of chip has a different EMF/power signature. These signatures also vary by manufacturer because of variances in raw materials sourcing and manufacturing processes. It is well known that under use conditions, especially when heat dissipation is not well implemented, chips degrade over time until failure. (That’s the point when the ‘magic smoke’ comes out and the chip stops working.) The cool part for me was the concept of malware detection. Aside from the physical properties of the chip, the software running on the chip will change the output pattern because of register changes and associated changes in code execution.
This means that if a probe is scanning a chip and malware installs itself, the scanner can detect it at the time of installation and alert an operator, potentially avoiding larger impact failures and data exfiltration. This technique reminds me of classic side-channel attacks on CPUs performing encryption, which attempt key extraction based on how the signals at the various chip leads change over time. (The key difference is that those attacks required contact with the open leads.) In a sense, this technology is highly disruptive, in a positive way: to current scanning system suppliers because of its lower cost, and to infrastructure and supply chain operators because of both the cost and the reported accuracy. It will significantly improve supply chain verification, system reliability, and security. I am looking forward to seeing how they progress in the marketplace over the next few years.

About PFP Cybersecurity
Headquartered in Washington, D.C., PFP Cybersecurity provides a unique, anomaly-based cyber security threat detection technology that can instantly identify software and hardware intrusion, including active and dormant attacks. With its innovative technology, PFP shortens the compromise detection gap to milliseconds by monitoring changes in electromagnetic frequencies and power usage. This physics-based technology can be applied to detect advanced malware and sophisticated threats in critical cyber systems. It can also detect hardware Trojans and counterfeits in the supply chain. For more information, please visit: www.pfpcyber.com
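To make the baseline-comparison idea concrete, here is a minimal sketch of how a monitor might classify a chip by comparing a captured EMF power spectrum against a stored baseline. All names, values, and the threshold are illustrative assumptions on my part; PFP’s actual signal processing is proprietary and certainly far more sophisticated than a simple distance check.

```python
import math

def anomaly_score(baseline, observed):
    """Euclidean distance between a stored baseline EMF power spectrum
    and a freshly captured one (both lists of floats, one value per
    frequency bin). Larger distance = larger deviation from baseline."""
    if len(baseline) != len(observed):
        raise ValueError("spectra must cover the same frequency bins")
    return math.sqrt(sum((b - o) ** 2 for b, o in zip(baseline, observed)))

def chip_state(baseline, observed, threshold=0.5):
    """Classify the chip as 'nominal' or 'anomalous' by comparing its
    deviation from baseline against a calibrated threshold."""
    return "anomalous" if anomaly_score(baseline, observed) > threshold else "nominal"

# A spectrum that drifts in a few bins (e.g., altered code execution
# caused by newly installed malware) pushes the score past the threshold.
baseline = [1.0, 0.8, 0.6, 0.4]
print(chip_state(baseline, [1.0, 0.8, 0.6, 0.4]))  # nominal
print(chip_state(baseline, [1.0, 1.6, 0.2, 0.4]))  # anomalous
```

In practice the threshold would be calibrated per chip model and manufacturer, since (as noted above) signatures vary with raw materials and manufacturing processes.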
Have you ever tried to create a major slide presentation on a tablet? Or edit a large spreadsheet? Or write a long document? Probably not. While it’s certainly possible to perform more substantial business tasks on a tablet, the limited screen real estate and system resources (processing speed, memory, graphics support, etc.) of current tablet form factors are typically insufficient. However, carrying a laptop with you everywhere you go just so you can access email is not very practical either. The reality is that we live in a multi-device world where the average worker employs three to five different computing devices in the regular performance of their job function, and I would argue that’s exactly how it should be. Each user employs the device they prefer to optimally perform tasks at any particular time or place.
Reflecting on my earlier career in IT management, I have to confess to a level of astonishment at how naïve IT administrative practices were just a decade or two ago. Failure events were common, and most organizations simply accepted systemic firefighting as an immutable fact. IT services critical to business operations were all too often held together with little more than a hope and a prayer. Sure, my colleagues and I were acutely aware of the importance of performing root cause analysis and implementing proactive management practices, but who had the time for that? Business pressures, support limitations, and time constraints most often sustained a mantra of “just get it working and move on!”