Opportunities in Manufacturing Data Science
The Promise of Big Data
As Travis Korte points out in Data Scientists Should Be the New Factory Workers, big data is paving the way for U.S. manufacturers to stay competitive in a global economy.
Companies like Ford and GM are integrating huge quantities of data – from internal and external sources, from sensors and processors – to reduce energy costs, improve production times and boost profits.
Even smaller businesses are seeing the benefits:
- Big data is cheaper and cheaper to store
- Analytics software is increasingly sophisticated and widespread
- Manufacturers have access to parallel processing machines
In an environment with no room for error, each turn of the screw counts.
On the Factory Floor
Raytheon learned this when it implemented a manufacturing execution system (MES), software that collects and analyzes factory-floor data.
By studying its data, Raytheon was able to determine that a screw in one of its components must be turned thirteen times. If it is turned only twelve times, an error message flashes and installation shuts down.
Like many other manufacturers, Raytheon is benefiting from the fact that complex robotics and automation have replaced humans on the factory floor. Machines embedded with sensors constantly convey high-quality data.
When properly parsed by data scientists, this information can be used to:
- Predictively model equipment failure rates
- Streamline inventory management
- Target energy-inefficient components
- Optimize factory floor space
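The first two applications above boil down to ranking equipment by how far its sensor readings have drifted from normal. Here is a minimal sketch of that idea; the machine names, sensor fields, and baseline values are all hypothetical, and a real system would use a trained statistical model rather than simple z-scores.

```python
# Sketch: prioritize machine inspections by sensor drift from baseline.
# All machines, sensors, and numbers below are made up for illustration.

def risk_score(reading, baseline):
    """Sum of z-scores: how far each sensor sits from its baseline."""
    score = 0.0
    for sensor, value in reading.items():
        mean, std = baseline[sensor]
        score += abs(value - mean) / std
    return score

def prioritize(machines, baseline):
    """Return machine names, highest combined drift first."""
    return sorted(machines, key=lambda m: risk_score(machines[m], baseline),
                  reverse=True)

baseline = {"vibration_mm_s": (2.0, 0.5), "temp_c": (60.0, 5.0)}
machines = {
    "press_A": {"vibration_mm_s": 2.1, "temp_c": 61.0},
    "press_B": {"vibration_mm_s": 4.0, "temp_c": 75.0},  # drifting badly
    "press_C": {"vibration_mm_s": 2.6, "temp_c": 58.0},
}

print(prioritize(machines, baseline))  # press_B inspected first
```

An inspection team working down this ranked list spends its limited hours on the machines most likely to fail, which is the logic behind the Intel example that follows.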
In 2012, Intel saved $3 million in manufacturing costs by using predictive analytics to prioritize its silicon chip inspections.
Internet of Things
And that’s just the internal data. Imagine, if you will, a world where machines bypass humans and speak directly to each other.
- Heating systems consult with weather channels, cell phones and cars to determine when they should fire up the furnace.
- Tractors use data from satellites and ground sensors to decide how much fertilizer to spread on a certain field.
This is called the Internet of Things, a concept that is already a reality. RFID readers, tags and sensors have become an integral part of manufactured objects, able to relay data to each other at the drop of a hat.
GE is particularly interested in the possibilities. In 2013, it announced that it was more than doubling the vertically-specialized hardware/software packages it offers to connect machines and interpret their data.
The goal is to reduce the amount of unplanned downtime for industrial equipment (e.g., wind turbines) and avoid potential problems (e.g., power grid outages). The trick is going to be ensuring that all of these objects are speaking the same language.
Smart Manufacturing and NNMI
Having suffered right along with manufacturers during the Great Recession, the government is doing its best to help. In 2012, the Obama administration proposed a National Network for Manufacturing Innovation (NNMI), modeled after the Fraunhofer Institutes in Germany.
The proposal calls for a series of public/private partnerships between U.S. industry, universities and federal government agencies. These partnerships would be focused on developing and commercializing new manufacturing technologies.
NNMI’s 2012 pilot institute was the National Additive Manufacturing Innovation Institute (NAMII), led by the National Center for Defense Manufacturing and Machining. In 2013, the administration announced the establishment of three new institutes, each with a separate focus:
- Digital manufacturing
- Composite materials
- Next-generation energy sources
Getting Past the Hurdles
But the manufacturing sector isn’t breaking out the champagne just yet. Though it’s luckier than some industries – much of its internal data is contained in a relatively structured environment – it still faces a number of challenges:
- Variety: Information, especially unstructured data, is often trapped in organizational “silos.” That means important data is not being shared among departments.
- Volume: Data from human sources (vendors, suppliers, distributors, customers, etc.) and sensor networks (in and outside the factory) are threatening to overwhelm analysts.
- Velocity: Manufacturing supply chains change rapidly in structure and flow. As William Tolone points out, “the more dynamic the data, the more difficult it is to analyze.”
Data scientists also have to be careful not to mistake the trees for the forest. With so many data points available, it’s possible to “discover” correlations and connections that aren’t really there at all.
Data Risks and Regulations
Big data has raised a number of red flags among watchdogs. Unlike the EU, the U.S. does not have a single data-protection law. Instead, there’s a hodgepodge of legislation, regulations and self-regulations. As sensors proliferate and the role of big data in manufacturing grows, the questions surrounding information will only grow louder:
- Who owns the rights to the data being collected and examined?
- What responsibilities do manufacturers have regarding sensitive or confidential information?
- Is this sensitive information securely stored?
- What happens when third parties become involved?
- Should manufacturers have an active role in protecting consumer privacy?
In this brave new world, there are no easy answers.
The Role of the Government
Manufacturers must also contend with the reality of post 9/11 industry. The USA Patriot Act gave the government sweeping powers of law enforcement and investigation. At some point, a manufacturer may find itself subject to a higher authority.
For example, picture a scenario in which a criminal has stolen a Ferrari:
- Data from automobile sensors are used to track the suspect’s position.
- The car is instructed to lock the doors and come to a screeching halt.
- It does, narrowly missing a guardrail and smashing into a wall.
It is at this point that the police discover that the manufacturer’s data error has targeted the wrong Ferrari.
This hypothetical case could be expanded to include complicated regulations surrounding the import and export of goods; articles and services related to the U.S. Defense Department; and practically any interaction with the financial industry. Whatever manufacturers do with big data, they must be aware of the consequences.
History of Data Analysis & Manufacturing
“The factory environment is a data scientist’s paradise: both highly multivariate and relatively quantifiable.” – Travis Korte, Data Scientists Should Be the New Factory Workers
The U.S. industrial revolution gave birth to a few things: mass production, environmental degradation, the push for workers’ rights… and data science.
No sooner had the first factories gone up than owners were looking for ways to squeeze more efficiency from the production process. It didn’t take long for smart entrepreneurs to realize the power of quantifiable analysis.
The Father of Scientific Management
In the mid-19th century, the young Frederick Winslow Taylor had a problem. Despite his Quaker wealth and considerable brains, his eyesight was poor. Following in his father’s legal footsteps seemed impossible.
But instead of frittering away his inheritance like any self-respecting dilettante, Taylor joined the masses. He became an apprentice pattern-maker and machinist at Philadelphia’s Enterprise Hydraulic Works.
After his promotion to laborer and machinist at Midvale Steel, Taylor began to notice that the machines – and the men who handled them – weren’t working efficiently.
He realized he’d need to back up his observations with hard data. One thing led to another, and the result was – scientific management.
Take his famous time studies. With a stopwatch in hand, Taylor would:
- Choose a factory task
- Break it down into its component parts
- Time each part to a hundredth of a minute
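Taylor's three steps translate naturally into a small data-recording exercise. The task, its component parts, and the timings below are invented for illustration; times are in minutes, recorded to a hundredth of a minute as the text describes.

```python
# Sketch of a Taylor-style time study: several stopwatch observations
# per component part, averaged into a standard time. Data is made up.

observations = {
    "pick up casting": [0.05, 0.06, 0.05, 0.08],
    "position in jig": [0.12, 0.10, 0.12, 0.14],
    "tighten clamp":   [0.08, 0.09, 0.08, 0.07],
}

def standard_times(obs):
    """Average each component part's timings, to a hundredth of a minute."""
    return {part: round(sum(times) / len(times), 2)
            for part, times in obs.items()}

parts = standard_times(observations)
total = round(sum(parts.values()), 2)
print(parts)
print(total)  # standard time for the whole task, in minutes
```

With the task decomposed this way, an outlier in any single component part points directly at what to tweak, which is exactly the diagnostic power Taylor was after.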
Was the lack of breaks impacting your productivity? Were your shovels the wrong size? Were your workers loafing? Thanks to his data observations, Taylor could tell you which tasks to tweak.
Increasing Output, Decreasing Time-Wasting
So could Frank and Lillian Gilbreth, two of the first management consultants in the business. In the early 20th century, their goal was to help workers (and their employers) increase output and decrease time-wasting tasks.
It was Frank Gilbreth who first proposed that a surgeon be given a “caddy” – someone to fetch the scalpel and sponges – and Gilbreth who came up with the technique that military recruits use for assembling and disassembling their weapons in the dark. He even used a motion picture camera to observe and time a worker’s tiniest motions.
For the Gilbreths, improvement was a never-ending story: every aspect of the workplace should be constantly questioned and new improvements constantly sought. Their legacy, as it has become known in boardrooms, is continuous quality improvement.
These lessons were not lost on automobile manufacturers. One of the most famous early pioneers was, of course, Henry Ford.
Ford was a believer in the quantifiable:
- Prompted by his employees, he installed moving assembly belts into his Model T plants to speed up production times.
- He introduced rigid specifications and quality criteria on component manufacture (reducing energy and time inputs to manufacturing by 60 to 90 percent in the process).
And, because black paint was the fastest-drying color on the assembly line, he famously stated:
“Any customer can have a car painted any color that he wants so long as it is black.”
Waste Not, Want Not
In Japan, similar developments were afoot. In the 1930s, Kiichiro Toyoda, founder of Toyota, discovered issues with the company’s engine manufacturing process. Too much money was being wasted in repairing poor quality work. Like Taylor, he turned to analysis, meticulously studying each step to identify the hiccups.
Taiichi Ohno, an executive at the company, took this systematic approach even further. With Ford’s work in mind, he developed the Toyota Production System (TPS), the forerunner of lean manufacturing.
His main objectives were to:
- Reduce waste
- Get quality right the first time
- Eliminate inconsistencies and overburdened processes
After visiting U.S. supermarkets, Ohno realized that actual sales – not sales targets – should be driving Toyota’s production line. Toyota began a policy of Pull (build-to-order) rather than Push (target-driven) manufacturing. Its profits soared.
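The difference between the two policies can be seen in a toy inventory calculation. The weekly demand figures and the push quota below are made up for illustration.

```python
# Toy comparison of Push (fixed quota) vs Pull (build what actually sold).
# All numbers are hypothetical.

demand = [80, 60, 100, 70, 90]   # actual weekly sales
push_quota = 100                 # fixed weekly production target

def ending_inventory(production_per_week, demand):
    """Unsold stock left after building and selling each week."""
    stock = 0
    for built, sold in zip(production_per_week, demand):
        stock += built - sold
    return stock

push = ending_inventory([push_quota] * len(demand), demand)
pull = ending_inventory(demand, demand)  # build exactly what sells

print(push, pull)  # push accumulates unsold stock; pull carries none
```

Unsold stock ties up cash and floor space, which is why letting real sales pull production through the line improved Toyota's bottom line.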
Japan’s post-war success shook up the U.S. and started a huge wave of interest in data-driven manufacturing approaches. In 1988, John Krafcik published an article, “Triumph of the Lean Production System,” introducing U.S. colleagues to Toyota’s processes and tools. Today, many folks know it as Lean Manufacturing.
The year was 1981. MTV went on the air with “Video Killed the Radio Star.” AIDS was first identified. And at Motorola, employees were developing a strategy that would become known as Six Sigma.
Like Lean Manufacturing, Six Sigma was aimed at eliminating errors, minimizing variability and improving overall quality. Its basis is the idea that every aspect of manufacturing and business processes can be measured, analyzed, improved and controlled.
What’s more, every Six Sigma project has a quantifiable target. Motorola could develop one to:
- Reduce costs
- Reduce pollution
- Increase customer satisfaction, etc.
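That quantifiable target is usually expressed as defects per million opportunities (DPMO) and the corresponding sigma level. The sketch below uses the standard formulas, including the conventional 1.5-sigma shift; the defect counts are hypothetical.

```python
# Sketch of the standard Six Sigma metrics. Defect counts are made up.
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    """Short-term sigma level, applying the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

d = dpmo(defects=35, units=10_000, opportunities_per_unit=1)
print(round(d))                    # 3500 defects per million
print(round(sigma_level(d), 2))
```

The famous "3.4 defects per million" benchmark is exactly this calculation run in reverse: a sigma level of 6 corresponds to 3.4 DPMO after the 1.5-sigma shift.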
In 1995, Jack Welch made Six Sigma a central part of his business strategy at General Electric, prompting a surge in adoption by other companies. The term is now ubiquitous in business and industry.