Thursday, June 29, 2017

Kaizen and the Philosophy of Continual Improvement

One of the most well-known and widely used philosophies of continual improvement originated in Japan. It is known as kaizen, which translates approximately to “good change”. Kaizen has been employed in a wide range of industries – healthcare, banking, psychotherapy, government, and many others. In business, kaizen typically refers to activities that continually improve all business functions and involve all employees.
Kaizen is frequently used to optimize purchasing, logistics, and supply chain processes, and has been employed in lean manufacturing processes to help eliminate waste. Kaizen was first used by Japanese businesses following World War II, and has since spread throughout the world and been implemented in environments outside of business and productivity.
Kaizen places a strong emphasis on employee feedback, encouraging employees at every level to apply the scientific method in learning how to spot and eliminate waste in business processes. Kaizen can be applied in a very small, personalized way, or it can apply to larger processes that involve groups of employees. In a very general way, the Kaizen methodology can be understood as:
  1. Discovering opportunities for small adjustments based on process data and customer feedback
  2. Implementing these small changes incrementally
  3. Monitoring the results of each individual adjustment for a certain period of time
  4. Using the new data to make adjustments
  5. Defining the results of successful adjustments as standards, and using these standards as baselines for additional improvements
  6. Repeating this cycle indefinitely
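For the programmatically inclined, the cycle above can be sketched in a few lines of Python. The metric, adjustment values, and acceptance rule here are entirely hypothetical – the point is only the shape of the loop: trial a small change, monitor the result, and keep it as the new standard only if it improves on the baseline.

```python
# A minimal sketch of the kaizen cycle described above. The process
# metric and adjustments are hypothetical, not from any real process.

def kaizen_cycle(measure, adjustments, baseline):
    """Trial each small adjustment; adopt it as the new standard
    only if the monitored metric improves on the current baseline."""
    standard = baseline
    history = []
    for adjust in adjustments:
        result = measure(standard, adjust)   # monitor the change
        if result > standard:                # did the metric improve?
            standard = result                # successful change becomes the new standard
        history.append(standard)
    return standard, history

# Toy example: "measure" simply applies the adjustment to a yield metric.
final, history = kaizen_cycle(
    measure=lambda base, delta: base + delta,
    adjustments=[0.5, -0.2, 0.3],   # small incremental changes
    baseline=90.0,
)
```

In practice, `measure` would be driven by real process data and customer feedback rather than a toy function, and the cycle would repeat indefinitely.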
The kaizen philosophy aims to improve process efficiency, quality, and safety by making it easier for employees to do their jobs well and with confidence – rather than expecting them to work harder through incentives or fear of replacement.
Improvements made using the kaizen philosophy are typically on a much smaller scale than those found in the “command and control” improvement programs popularized in the mid-twentieth century.
This system of incrementally improving operations is also known as the Shewhart Cycle, the Deming Cycle, or PDCA (Plan-Do-Check-Act).
Similar ideas are investigated in the realms of Organizational Development (OD) or Business Process Improvement (BPI). The general intent of all of these philosophies is the same: to maximize the value of all available material, personal, and intellectual assets and to improve business processes by making use of resources that are already available.
Like the methods outlined above, other popular methods like Six Sigma, Lean, and Total Quality Management emphasize employee involvement and collaboration, standardizing processes, and reducing variations, defects and cycle times.
Excerpted from the whitepaper “Continual Improvement with Status”, downloaded at www.scada.com.

Monday, June 26, 2017

4 Common Obstacles Between Your Enterprise and the IoT


It should come as no surprise that most companies today have some sort of IoT initiative being discussed, planned, or developed – if not already implemented. And this phenomenon is global and completely horizontal. The early adopters of IoT are already seeing positive returns, and the march of progress is overwhelming if not inevitable.

Why Aren’t We All There Yet?

For those still planning their IoT initiatives and smoothing out the details, there are several barriers that can get in the way. Some of the most commonly cited in surveys include: security concerns, difficulty quantifying ROI to CEOs, concerns about compatibility with existing data systems, and concerns about the technical skills of the staff to implement such strategies.

Obstacle 1 – Increased Exposure of Data/Information Security

As could be expected, security is almost always the biggest concern in most organizations. With the World Wide Web as an example, people today are fully aware of the dangers inherent in transmitting data between nodes on a network. With many of these organizations working with key proprietary operational data that could prove advantageous to a competitor if exposed, the concern is very understandable.

Obstacle 2 – Proving ROI/Making the Business Case

This is a classic example of not knowing what you don’t know. Without an established example of how similar initiatives have impacted your organization in the past – or even how similarly sized and structured organizations have been impacted – it can be very difficult to demonstrate in a tangible way exactly how these efforts will impact the bottom line. Without being able to make the business case, it will be difficult to get executives to sign off on any new initiatives. This is likely why larger organizations ($5+ billion in annual revenue) are much more likely to have already implemented IoT initiatives, while smaller organizations are still in the planning phase.

Obstacle 3 – Interoperability with Current Infrastructure/Systems

Nobody likes to start over, and many of the executives surveyed are dealing with organizations that have made enormous investments in the technology they are currently using. The notion of a “rip and replace” type of implementation is not very appealing. The cost is not only related to the downtime incurred in these cases, but the wasted cost associated with the expensive equipment and software systems that are being cast aside. In most cases, to gain any traction at all a proposed IoT initiative will have to work with the systems that are already in place – not replace them.

Obstacle 4 – Finding the Right Staff/Skill Sets for IoT Strategy and Implementation

With the IoT still being a fairly young concept, many organizations are concerned that they lack the technical expertise needed to properly plan and implement an IoT initiative. There are many discussions taking place about how much can be handled by internal staff and how much may need to be outsourced. Without confidence in their internal capabilities, it is also difficult to know whether they even have a valid strategy or understanding of the possibilities. Again, this is a case where larger organizations with larger pools of talent have an advantage.
There are some valid concerns, and not all of them lend themselves to simple solutions. In truth, many of the solutions will vary from one organization to the next. However, in many cases the solutions could be as simple as just choosing the right software platform. Finding a platform that eases your concerns about interoperability can also help ease your concerns about whether your staff can handle the change, as there will be no need to replace equipment. Likewise, a platform that can be integrated seamlessly into your current operations to help improve efficiency and implement optimization strategies will also make it much easier to demonstrate ROI.
Excerpted from the whitepaper “Choosing the Right IoT Platform”, downloaded at www.scada.com.

Tuesday, May 16, 2017

3 Keys to Effective Real-Time Data Visualization

There are several important factors to consider when creating your real-time data visualization, many of which will depend on your intended application. Today, we look at a few of the general factors that will play a role in every visualization you create. These three factors are clarity, consistency, and feedback.
Clarity
Real-Time graphics should emphasize pertinent information and use design principles that promote ease-of-use and accessibility above aesthetics. Things like size, color and brightness can be used to distinguish primary details from secondary and tertiary details. Special graphics can be created to emphasize different information under different conditions (i.e. a special set of graphics to be used when a certain alarm is triggered).
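As a loose illustration (not drawn from any particular HMI standard), the priority-based emphasis described above might be encoded as a simple style lookup, with an active alarm promoting a value to primary emphasis. The specific sizes and colors are purely illustrative:

```python
# Hypothetical mapping from information priority to visual emphasis:
# primary details get the strongest size/color cues, tertiary recede.

EMPHASIS = {
    "primary":   {"font_pt": 18, "color": "#d32f2f", "bold": True},
    "secondary": {"font_pt": 14, "color": "#333333", "bold": False},
    "tertiary":  {"font_pt": 11, "color": "#888888", "bold": False},
}

def style_for(tag_priority, alarm_active=False):
    """Return the display style for a data point; an alarm promotes
    a tag to primary emphasis regardless of its normal priority."""
    if alarm_active:
        return EMPHASIS["primary"]
    return EMPHASIS.get(tag_priority, EMPHASIS["tertiary"])
```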
When planning a real-time visualization scenario, it is very important to consider who will be using the visualization and what their purpose is in viewing the data. This will obviously vary from one organization to the next, but when differentiating between primary, secondary, and tertiary information, it is important to think not in terms of what is important about the thing being monitored, but what is important to the person doing the monitoring.
Consistency
Consistent visualizations are standardized and consistently formatted. Interaction requires a minimum of keystrokes or pointer manipulations. In fact, whenever possible, all relevant information should be visible without the need to navigate to another screen. When navigation is necessary, be certain that elements of the user interface related to navigation are clearly distinguished from elements that relay pertinent information. Additionally, navigation and interaction of any type should be as easy and intuitive as possible.
The ergonomic needs of the user are also extremely important. Poor data visibility has been cited as a primary cause of many industrial accidents where a process was being monitored or controlled through a real-time HMI (Human Machine Interface). In fact, poorly designed HMIs have been blamed for accidents that have led to millions of dollars in damaged equipment and some very unfortunate and unnecessary deaths.

A recent study by OSHA in Europe compiled statistics on HMI-related errors in the workplace. Interestingly, research shows that the majority of problems are caused by human error, but not entirely because of mental and physical fatigue. More often, errors are caused by poor decision-making related to the way that information is processed.
  
Feedback
An operator should be fully confident that the choices they make are having the desired effect. Screens should be designed in a way that provides information, putting relevant data in the proper context. Also, important actions that carry significant consequences should have confirmation mechanisms to ensure that they are not activated inadvertently.
Controls should function consistently in all situations. If something is not working as it should, that fact should be immediately obvious and undeniable. Again, in a well-designed system, design principles are employed to promote clarity and simplicity, and to reduce user fatigue.
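A confirmation mechanism of the kind mentioned above can be sketched very simply. The action names and the two-step confirm flow here are illustrative assumptions, not any specific product's API:

```python
# Sketch of a confirmation gate for consequential operator actions:
# critical actions require an explicit confirmation before executing.

CRITICAL_ACTIONS = {"shutdown_pump", "open_relief_valve"}  # hypothetical

def execute(action, confirmed=False):
    """Run an action immediately unless it is critical, in which
    case an explicit confirmation is required first."""
    if action in CRITICAL_ACTIONS and not confirmed:
        return f"CONFIRM REQUIRED: {action}"
    return f"EXECUTED: {action}"
```

The same gate also gives the operator the feedback loop discussed above: every call returns an unambiguous statement of what did (or did not) happen.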
Keep it simple and straight-forward. Save the complex visual tools for historical data or real-time reporting. There is certainly a place for all of this, but that place is not where real-time data is being used to make real-time decisions.
Learn more in the free whitepaper “Real-Time Data Visualization Essentials”:


Tuesday, May 3, 2016

Is the Internet of Things Really Happening?

Over the last few years there has been much speculation about the inevitable growth of the Internet of Things (or Internet of Everything). Forecasts have suggested anywhere from 30 to 50 billion devices will be connected by 2020. Cisco has estimated that the global IoT ecosystem will have a value of $14.4 trillion by 2022, and IDC has projected yearly IoT market revenue to increase to $1.7 trillion by 2020. 

Here we are now in 2016, a few years into the future they were talking about back then, and it may be a good time to take a look at the current state of the IoT and see how it measures up to all of these lofty expectations. Are people really embracing IoT technology at this rate? Is this money really being invested?


Connected Devices
First, let’s take a look at the number of connected devices. If we flash back to 2013, we find that Gartner released a report entitled “Forecast: The Internet of Things, Worldwide, 2013”. In this report, they predicted that the IoT will include 26 billion connected devices by 2020. Two years later, Gartner reported a total of 4.9 billion connected devices at the end of 2015, up from 3.8 billion in 2014. Gartner also revised their 2020 estimate, anticipating 20.7 billion connected devices by 2020, a decrease of 5.3 billion (20.4%) from their 2013 estimate. (It should be noted here that Cisco continues to anticipate as many as 50 billion by 2020).

So, according to Gartner, IoT adoption has not proceeded at the rate they had anticipated at the end of 2013.
One reason for the slower-than-expected growth is the difficulty faced when trying to implement IoT technology. In fact, Gartner anticipates that through 2018, 75% of IoT projects will take up to twice as long as planned. 


Value of the IoT
Now, let’s consider the monetary value of the IoT and how that number has progressed. Cisco initially projected a value of $14.4 trillion by 2022. Within two years Cisco had increased this number to $19 trillion.


This highlights an interesting fact. Even though fewer connected devices are expected by this date, the total value of these devices and the underlying network is expected to be greater than it was when more devices were expected. Based on this, I think it’s safe to suggest that implementing IoT technology is turning out to be more expensive than originally thought. 

This may be due in part to the fact that some enterprises are rushing headlong into IoT projects without the proper foresight and planning. Often it is a reaction to competitive pressure, based on a perception that a competitor is already moving forward with their IoT strategy, or simply in an effort to be the first and gain a competitive edge.

“I think it’s safe to suggest that implementing IoT technology is turning out to be more expensive than originally thought.”


Another answer may come from Gartner’s 2015 report: “Predicts 2015: The Internet of Things”, in which Gartner predicts that through 2018, there will be “no dominant IoT ecosystem platform”. They cite a lack of IoT standards and anticipate that IT leaders will be forced to compose solutions from multiple providers.

Read our White Paper on Choosing the Right IoT Software Platform


Even when faced with these realities, however, enterprises are still moving forward with their IoT projects. The extra expense – though unanticipated – is not nearly enough to outweigh the potential benefits. The IoT is most certainly transforming the way businesses operate, and no one wants to be the last one to this dance.


IoT Investment
This is an important category as it will largely determine how quickly the industry moves to develop standards, and how motivated IoT solution providers will be to develop more powerful and more cost-effective solutions.

Recall IDC’s projection of annual market revenue reaching $1.7 trillion by 2020. It would stand to reason that if we are learning that IoT projects are coming in over budget and late, there is probably some distaste in the marketplace, and maybe IDC’s projection was a bit ambitious.
At the same time, though, if people are spending more on IoT initiatives than they had originally planned, perhaps IDC’s projection was a bit conservative. Let’s examine how things are taking shape.
In 2015, IDC reported that worldwide IoT spending reached $655.8 billion in 2014 and calculated a 16.9% CAGR (Compound Annual Growth Rate).
Well, 2015 is now in the books and we can see how IDC’s projections seem to be holding up. Their latest report indicates that spending in 2015 reached $698.6 billion, a CAGR over 2014 of only 6.53%. Had IDC’s anticipated CAGR proven accurate, 2015 revenue should have been closer to $766 billion.
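For those who like to check the arithmetic, the comparison above works out as follows (figures from the IDC reports cited in the text):

```python
# The growth-rate check from the paragraph above: actual 2015 spending
# vs. what IDC's projected CAGR would have implied.
spending_2014 = 655.8     # $ billions, IDC-reported 2014 spending
spending_2015 = 698.6     # $ billions, IDC-reported 2015 spending

actual_growth = (spending_2015 / spending_2014 - 1) * 100
projected = spending_2014 * 1.169     # 16.9% projected CAGR

# actual_growth is roughly 6.5%; projected is roughly $766.6 billion
```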
Notwithstanding this fact, however, IDC continues to project a CAGR of 17% and an increase in spending to $1.3 trillion by 2019, which would equal approximately $1.5 trillion in 2020. It looks like IDC sees the IoT market cooling off a bit, though not much.

So, while the earlier projection has proven to be overly optimistic, it is clear that investments in IoT initiatives are continuing to increase with no end in sight.

If there is any kind of meaningful takeaway from all of this, I think it’s safe to surmise that IoT projects may be coming in late and over budget, but that doesn’t seem to have had much of an impact on continued investments. It is clear that business owners and executives see the value and have no interest in letting their competitors gain an edge.
So, was the IoT hyped a bit excessively over the last couple of years? Maybe a bit. But, it is also very real and happening right now.

Friday, April 15, 2016

Choosing the Right Maintenance Strategy

How do you choose the right maintenance strategy for your organization? Someone from the outside looking in might think the notion of choosing a maintenance strategy is as simple as choosing between ‘repair it’ or ‘replace it’, and that’s not entirely inaccurate. Beyond the surface, though, there are a number of different considerations that can have a long-term impact on a company’s bottom line and ultimate viability. Particularly when working with numerous or expensive essential assets that are subject to the continual wear-and-tear and eventual breakdown that plagues all machines, maintenance costs can take enormous bites out of revenue.
Fortunately, numerous maintenance strategies have evolved over the years, and technology allows us to apply new techniques using new models that were previously unheard of. Let’s review some of the more popular maintenance strategies:
Reactive Maintenance
This is the simplest strategy, sometimes referred to as ‘breakdown maintenance’. The premise is simple: Use something until it can no longer be used. Then, do what needs to be done to repair it and get it back in action. If it can’t be repaired, replace it. There are some benefits when compared to other strategies, such as lower initial costs and reduced staff, as well as eliminating the need to plan. Of course, these benefits are usually negated in the long term by unplanned downtime, shortened life expectancy of assets, and a complete inability to predict breakdowns and maintenance needs. The only real viable reason for employing this strategy is an inability to afford the initial costs of any other strategy.

Preventative Maintenance
Preventative maintenance is performed while an asset is still operational in order to decrease the likelihood of failure. In this strategy, maintenance is performed according to a particular time or usage schedule. For instance, regular maintenance will be performed when a particular machine reaches 5,000 hours of uptime since the last maintenance. Preventative maintenance will typically keep equipment operating with greater efficiency and extend the lifetime of the asset compared to reactive maintenance, while also preventing unnecessary downtime. It does, however, require greater planning and manpower. Preventative maintenance is not a good choice for assets like circuit boards that can fail randomly regardless of maintenance. It is also not ideal for assets that do not serve a critical function and will not cause downtime in the event of a failure.
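The usage-based trigger described above amounts to a one-line rule. The 5,000-hour interval is the example from the text; a real program would track separate intervals per asset class:

```python
# Minimal sketch of a usage-based preventative maintenance trigger:
# service is due once runtime since the last service hits a threshold.

MAINTENANCE_INTERVAL_HOURS = 5000   # interval from the example above

def maintenance_due(hours_since_last_service):
    """True once the asset has accumulated enough runtime."""
    return hours_since_last_service >= MAINTENANCE_INTERVAL_HOURS
```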

Predictive Maintenance
The purpose of predictive maintenance is to predict an imminent failure and perform maintenance before it occurs. This strategy requires some specific condition monitoring and will typically have a higher upfront cost due to the need to add sensors or other hardware, and will also require skilled personnel capable of anticipating failures based on the data points being monitored. Benefits include: the ability to prevent unnecessary downtime, and minimal time spent performing maintenance as it is only done when failure is imminent. Predictive maintenance is usually not a good option for assets that do not serve a critical function, or assets that do not have a predictable failure mode.
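A predictive check might look something like the sketch below, which flags an asset when a monitored condition trends toward a known failure threshold. The vibration limit, the 90% margin, and the window size are all hypothetical assumptions:

```python
# Hypothetical predictive-maintenance check: flag an asset when a
# monitored condition (e.g. bearing vibration) trends toward failure.

FAILURE_THRESHOLD = 8.0   # mm/s vibration velocity, assumed limit

def failure_imminent(readings, window=3):
    """Flag imminent failure when the recent average approaches the
    threshold AND the trend across the window is rising."""
    recent = readings[-window:]
    avg = sum(recent) / len(recent)
    rising = recent[-1] > recent[0]
    return avg >= 0.9 * FAILURE_THRESHOLD and rising
```

Real deployments would use proper trend analysis over much longer histories; the point is that maintenance is scheduled only when the data says failure is near.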

Condition-Based Maintenance
Condition-based maintenance is similar to predictive maintenance in that it involves continually monitoring specific conditions to determine when maintenance should be performed. Typically, however, condition-based maintenance is not just performed to prevent failure, but also to ensure optimum efficiency, which can not only improve productivity but extend the life of the asset as well. Because condition monitoring equipment and expertise can be expensive, initial costs can be quite high – prohibitive in some cases. In the long term, however, condition-based maintenance may be the most cost-effective strategy for ensuring optimal productivity and extended asset lifecycles. Condition-based maintenance is usually not a good choice for non-critical assets or older assets that may be difficult to retrofit with sensors.

When choosing a maintenance strategy, think about your goals: both long-term and short-term. Determine which of your assets are critical and which are not. Calculate the cost of downtime (per minute, per hour, etc.). Take into account whatever data may already be available for you to monitor. Determine the cost and viability of adding sensors to monitor things like temperature, vibration, electric currents, subsurface defects (ultrasonic sensing), or vacuum leaks (acoustic sensing). Estimate the costs of maintenance personnel in different scenarios. Estimate the difference in costs between each of the different strategies.
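The downtime-cost calculation suggested above is straightforward; here it is with made-up figures purely for illustration:

```python
# Worked example of the downtime-cost calculation. The revenue and
# duration figures are hypothetical, not from any real operation.
revenue_per_hour = 12_000.0    # $ of lost production per hour down
downtime_hours = 3.5           # duration of one outage
cost = revenue_per_hour * downtime_hours   # cost of this outage
```

Multiplying that figure by the expected outage frequency under each strategy gives a rough annual number to weigh against each strategy's upfront and staffing costs.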
You may determine that a condition-based maintenance program would provide the greatest value, but you lack the resources to implement it right away. Can you deploy a simple predictive maintenance program in the meantime, while positioning yourself to make the leap to CBM in the future?
There is not going to be any one-size-fits-all “best” strategy, and not much drains a bank account faster than over-maintaining your equipment (yes, there is such a thing). Consider your circumstances and your goals, and choose wisely. It’s one of the most important business decisions you will make.

Thursday, April 7, 2016

3 Reasons Modern Farmers Are Adopting IoT Technology at an Astounding Rate

It seems like everything today is touched in some way by the Internet of Things. It is changing the way goods are produced, the way they are marketed, and the way they are consumed. A great deal of the IoT conversation has revolved around transformation in industries like manufacturing, petrochemical, and medicine, but one industry that has already seen widespread adoption of IoT technology is often overlooked: agriculture.
Of course, many of us are very familiar with some of the efforts that have been made to optimize food production. As populations continue to grow, there has been a serious and sustained drive to increase the crop yield from our available arable land. Some of these efforts have not been particularly popular with consumers (i.e. pesticides, GMOs).
With the advent of new technology and the Internet of Things, farmers are finding new ways to improve their yields. Fortunately for us, these new ways are decidedly less disturbing than toxic chemicals and genetic manipulation. Using sensors and networked communication, farmers are discovering ways to optimize already-known best practices to increase yield and reduce resource consumption.
If it’s surprising that the agricultural industry would be technological innovators, it’s worth considering how agriculture is in many ways an ideal testbed for new technology.  
There are a few good reasons for this:

1. Ease of Deployment
Unlike in other industries, deploying sensors and other connected devices on a farm can be relatively easy and inexpensive. In a heavy industrial environment like a factory or refinery, new technology must replace old technology that is thoroughly embedded in the production infrastructure. There are concerns about downtime and lost revenue, as well as concerns about finding the right products or group of products to integrate into their existing technological ecosystem. On a typical farm, there is no need for downtime, and usually no concern for any existing technology that may be incompatible. Inexpensive sensors placed in various parts of a cultivated field can quickly yield very useful actionable data without disrupting a single process. 
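As a simple illustration of that actionable data, a field's soil-moisture sensors might drive an irrigation decision like this (the setpoint and readings are hypothetical):

```python
# Illustrative field-sensor decision: irrigate when the average soil
# moisture across a field's sensors drops below a crop setpoint.

MOISTURE_SETPOINT = 30.0   # % volumetric water content, assumed value

def irrigation_needed(sensor_readings):
    """Irrigate when the field's average reading falls below setpoint."""
    avg = sum(sensor_readings) / len(sensor_readings)
    return avg < MOISTURE_SETPOINT
```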

2. Instant Value
Another reason that agriculture has provided such a fertile testbed for IoT technology is the speed with which value and ROI can be realized. Pre-existing metrics of precision agriculture can be applied more easily, maximizing the already-known benefits of established practices (knowing what types of crops to plant when, knowing when and how much to water, etc.). Farmers have also had success safely and naturally controlling pests through the intelligent release of pheromones. Of course, there is the obvious and very tangible benefit of decreased resource consumption and increased yield. A modest investment can yield measurable results within a single season.

3. Continual value
In agricultural IoT deployments, the same practices that provide instant value will continue to provide value for as long as they are employed. Conservation of water and waste reduction provide repeated value, as well as the increased yield brought on by precision farming. There are also opportunities to improve the equipment that farmers use every day. A connected combine or tractor can record useful information about its operation and maintenance. It can also allow for certain processes to be optimized and automated.

There are some real concerns about our ability to feed our ever-growing population in the future. While controversial technologies like genetically-modified-organisms have helped to increase food production, these techniques are not exactly popular with the general public, many of whom have voiced concerns about the long-term impact of a genetically-modified diet.
The good news is that similar increases in food production are possible without the need to modify the food; we simply have to modify the processes used to produce it. And it’s not just about food production. Plants are also used for biofuels and as raw materials in manufacturing. By increasing yield and reducing resource consumption, growers are also having a positive impact on numerous other industries.
For instance, a Colorado-based company called Algae Lab Systems is helping algae farmers improve their output by introducing sensors to measure environmental factors like temperature, pH, and dissolved oxygen in their photobioreactors and algae ponds. Algae growers are now able to continuously monitor their crops from any location, also allowing for larger and geographically dispersed operations.
A case study detailing Algae Lab Systems provides some insight into how they are transforming the algae farming industry, and aquaculture in general.

Monday, April 4, 2016

To Each His Own: Creating Custom Dashboards for Operators and Analysts

It’s always very annoying when I try to perform what seems like it would be fairly routine maintenance on a home appliance or worse – my car – only to find out that this seemingly simple thing I would like to do is actually quite difficult with the tools at my disposal. A little bit of research usually reveals that it actually is quite simple; I just have to buy this proprietary tool from the manufacturer for what seems like a ridiculous price, and then I can proceed.
Of course, it’s easy to understand why the manufacturer doesn’t want to make it easy for end users to service their product. They want you to buy a new one, or at the very least buy this overpriced tool from them so they can scrape every morsel of profit afforded by their built-in obsolescence.
It really makes me appreciate the simplicity and widespread application of some of our more traditional tools. Take a hammer, for instance. If you need to drive a nail into wood, it doesn’t matter if it’s a big nail, a little nail, a long nail, or a short nail. It doesn’t matter who manufactured it or when. All that matters is that it’s a nail. Just get a hammer; you’ll be fine.
This got me thinking. What if we had a hammer for every type of nail available? What if each hammer was perfectly sized, shaped, weighted and balanced for each particular nail? And what if that perfect hammer was always available to you every time you needed it? This isn’t realistic, obviously, but it reminds me of some of the things I hear from our customers.
One of the great benefits cited by our end users is the ability to create custom dashboards for the different work responsibilities in their organizations. The same system is used to create maintenance dashboards for technicians, control panels for operators, system overviews for managers, reports for analysts, and even special dashboards for contractors and vendors. By providing every member of the team with a real-time view of exactly the information they need to do their jobs and nothing more, each person is empowered to do their jobs with the utmost efficiency – improving the speed and accuracy of decision-making as well as increasing the capacity for planning.
In the past, so much of our data visualization was tied to the device from which the data was drawn. If you wanted to know something about a particular machine, you had to look at the same picture as everyone else, regardless of what you needed to see.
Some modern software platforms like B-Scada’s Status products eliminate this need to tie visualizations to the device from which the data is drawn. It is now possible to visualize data from multiple devices at multiple locations through the same interface. This allows for a new concept in user interface design: rather than displaying all available information about this particular thing, you can now display all information relevant to a particular task or set of tasks.  
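Conceptually, this role-based filtering looks something like the sketch below. The roles and tag names are hypothetical, and Status's actual configuration model will differ; the idea is simply one shared data feed, many tailored views:

```python
# Sketch of role-based dashboard filtering: one live data source,
# a different slice of it for each job function.

DASHBOARDS = {
    "technician": {"pump_vibration", "motor_temp", "maint_log"},
    "operator":   {"flow_rate", "pressure", "alarms"},
    "manager":    {"throughput", "uptime_pct", "alarms"},
}

def view_for(role, live_data):
    """Return only the tags this role needs, from the shared feed."""
    wanted = DASHBOARDS.get(role, set())
    return {tag: value for tag, value in live_data.items() if tag in wanted}
```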
It’s not quite “a hammer for every nail”; it’s more like a complete tool set tailored to every job, containing exactly the tools you need and nothing more. It’s really been a transformative development for many organizations.
B-Scada recently released a case study detailing how one prominent North American electric utility used Status to create a system of customized views for their operators, managers, and analysts, providing specific insights into the real-time status of their generation resources:
Read It Now


**B-Scada specializes in data acquisition and visualization solutions, and has developed custom user interfaces for customers in various industries around the world. Learn more at www.scada.com