Short history of manufacturing: from Industry 1.0 to Industry 4.0
Author: Adrian Dima
Producing objects is as old as mankind: essentially, every product made by individuals or groups can be regarded as the outcome of a basic manufacturing process. Over time, humans manufactured objects for their own use or for commercial purposes, in small workshops and using basic tools.
A first major shift came with the First Industrial Revolution (Industry 1.0) in the 18th century, when, instead of items being produced by hand with basic tools, newly invented processes allowed them to be produced by machines. This began in England in 1760 and finally reached the United States by the end of the 18th century.
The First Industrial Revolution marked a shift from an agrarian and handicraft economy to one dominated by machinery and significantly impacted industries like mining, textiles, glass and agriculture.
The dramatic reduction in material cost and production time affected several industries, including textiles. Prior to this period, textiles were mainly made in people’s homes, with merchants providing the basic equipment and materials needed. Workers therefore set their own schedules, which made the trade difficult to regulate. Inventions such as the steam engine, the spinning jenny, and the water wheel changed the face of manufacturing and set it on a path of innovation that continues to this day.
Because demand was greater than supply, one major downside of this period was the pressure on the lower working class. Until 1833, almost no standards existed for workers, which meant long hours and dangerous working conditions, especially for children. This led to the Factory Act of 1833, which placed restrictions on the working hours of children and set standards to protect workers.
The next shift in manufacturing came between 1871 and 1914, a period known as the Second Industrial Revolution (Industry 2.0), as a result of extensive railroad and telegraph networks that allowed faster movement of people and ideas. The introduction of electricity allowed factories to develop modern production lines. In fact, the first assembly line was patented in 1901 by Ransom E. Olds, producer of Oldsmobile cars. His method allowed his company to produce 20 units per day and eventually increased its output by 500 percent in one year. As a result, Oldsmobile could create more vehicles while drastically decreasing prices at the same time. The method used by Olds ended up serving as the model for Henry Ford, who created his own system. Ford is now credited as the father of the assembly line and of automotive mass manufacturing.
The Second Industrial Revolution was a period of great economic growth, with an increase in productivity, but also caused a surge in unemployment since many factory workers were replaced by machines.
The Third Industrial Revolution (Industry 3.0), also known as the Digital Revolution, began in the 1970s through partial automation using memory-programmable controllers and computers.
The central point of this phase is the mass production and widespread use of digital logic, MOS transistors, and integrated circuits, together with their derived technologies, including computers, microprocessors, digital cellular phones, and the Internet. These innovations transformed traditional production and business methods. In essence, the Digital Revolution converted technology that had been analogue into digital form.
It is important to mention that Industry 3.0 is still present; most factories remain at this level of evolution.
Nowadays, attention has turned to the Fourth Industrial Revolution, known as Industry 4.0: a union between physical assets and advanced digital technologies, such as the Internet of Things (IoT), Artificial Intelligence (AI), robots, drones, autonomous vehicles, 3D printing, and cloud computing, all interconnected and able to communicate, analyse, and act. Organizations adopting Industry 4.0 are more flexible, responsive, and intelligent, and therefore better prepared for data-driven decisions.
Industry 4.0 originated in 2011 from a project in the high-tech strategy of the German government promoting the computerization of manufacturing; the term “Industry 4.0” was publicly introduced that same year at the Hannover Fair.
There are four design principles identified as integral to Industry 4.0:
- Interconnection — the ability of machines, devices, sensors, and people to connect and communicate with each other via the Internet of Things (IoT) or the Internet of People (IoP)
- Information transparency — the transparency afforded by Industry 4.0 technology provides operators with comprehensive information to make decisions.
- Technical assistance — the technological facility of systems to assist humans in decision-making and problem-solving, and the ability to help humans with difficult or unsafe tasks
- Decentralized decisions — the ability of cyber-physical systems to make decisions on their own and to perform their tasks as autonomously as possible.
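The last two principles can be illustrated with a minimal sketch: a machine that acts autonomously on its own sensor readings, escalating to a human operator only when self-correction is not enough. This is purely illustrative code, not from any real Industry 4.0 platform; the machine name and temperature thresholds are hypothetical.

```python
# Illustrative sketch of a decentralized decision in a cyber-physical system.
# The machine evaluates its own temperature reading and chooses an action
# without waiting for a central controller. Thresholds are made-up values.

from dataclasses import dataclass

@dataclass
class Machine:
    name: str
    warn_temp_c: float = 70.0   # above this, slow down autonomously
    max_temp_c: float = 80.0    # above this, shut down to prevent damage

    def decide(self, temp_c: float) -> str:
        """Return the action taken for a single temperature reading."""
        if temp_c >= self.max_temp_c:
            return "shutdown"       # autonomous protective stop
        if temp_c >= self.warn_temp_c:
            return "reduce_speed"   # degrade gracefully, notify operator
        return "continue"           # normal operation

mill = Machine("cnc-mill-07")
print(mill.decide(65.0))   # → continue
print(mill.decide(72.5))   # → reduce_speed
print(mill.decide(85.0))   # → shutdown
```

In a real deployment, the reading would arrive over an IoT protocol such as MQTT and the "reduce_speed" branch would also publish a notification, which is where the technical-assistance principle comes in.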
Since 2011, technology companies have focused on incorporating the newest technologies into their products and bringing Industry 4.0 principles into real production. This has been made possible by an unprecedented evolution of technologies that are easy to adopt and integrate, shortening the time needed to create and launch new products.