“When experts are wrong, it’s often because they’re experts on an earlier version of the world.” -Paul Graham, co-founder of Y Combinator
Think back to 1999. Do you remember what you were an expert on — Dawson’s Creek, Third Eye Blind, Super Smash Bros? Fast-forward to today: is any of that stuff still relevant? As dreamy as Pacey Witter was, the answer is no. The same goes for how people learn at work and evolve with shifting responsibilities and challenges.
The data from our newest research report, How the Workforce Learns, show that people are learning constantly in personalized formats, independent of formal training. So why do so many L&D professionals insist on keeping dated curricula and traditional training methods?
History can answer that question. Today’s L&D strategies evolved from an earlier economic vision, one built around mass production, in which workers were rarely given unique opportunities. In short, training was designed with organizations in mind, not end-users. The result was an L&D program that could be scaled but ultimately offered little impact, and that blueprint is the one we still see across organizations today.
Reviewing this evolution can help us understand where L&D is falling short and how we can build a platform that works just as well for employees as it does for their companies.
1890s-1900s
Customers returned a $50,000 shipment of defective cash registers to the National Cash Register Company in 1894. Hoping to boost worker morale and quality control, executives improved factory conditions and built a company library. This was typical of the “corporate welfare programs” of the time.
But it was not enough. In 1901, NCR workers went on strike, criticizing the company’s corporate welfare approach. In response, NCR formed the nation’s first personnel department, charged with establishing official procedures for deciding who got hired, fired, or promoted. At the same time, the company opened the NCR Schoolhouse. With that, NCR had invented what is now known as Human Resources.
1910s-1920s
Not long after pioneering the assembly line, Ford Motor Company formed a Sociological Department — another early version of HR — to help employees learn. Some were skeptical of this investment in “lower-level” workers, but Henry Ford insisted that it was worthwhile: “The only thing worse than training your employees and having them leave is not training them and having them stay.”
Ford offered classes in personal finance, English, homemaking, and hygiene. The company also sent agents to inspect the homes of workers. Many felt some of these efforts were too invasive and controlling. But for Ford, the workforce was a valuable asset that he wanted to monitor closely.
1940s-1950s
AT&T’s Bell Labs laid the foundation for today’s digital economy. The telecommunications company recruited top scientists and attracted massive government funding. As a result, its researchers invented the transistor, the laser, and several computer programming languages.
In some ways, Bell Labs broke the mold. Employees explored their own interests and were encouraged to chat with unfamiliar colleagues who might share a helpful idea. However, that freedom belonged only to elite researchers. The vast majority of AT&T workers were not encouraged (and in fact, not allowed) to engage in this kind of learning.
1970s-1980s
In 1979, Motorola’s CEO decided that, for the firm to survive, every employee needed new skills. The firm feared foreign rivals, worried about the competence of its U.S. workforce, and wanted to improve its quality controls. Over the next decade, the company addressed these concerns by building Motorola University.
For this corporate university, Motorola turned hundreds of existing employees and recent retirees into faculty, while also partnering with local community colleges. It pushed every employee to study basic literacy and math, then let go of the handful who resisted the new strategy. In 1990, Motorola estimated that its university cost over $100 million annually, but the company confidently declared that it was worth the investment.
1990s-2000s
Learning Management Systems originally came from traditional universities, not businesses. The LMS made it possible to deliver learning digitally, reaching people that brick-and-mortar universities never could. The original system, FirstClass, was created by telecom workers in Toronto who went on to start a company called SoftArc. Their LMS was popularized by the United Kingdom’s Open University in the 1990s.
Businesses quickly caught on, training their workers on similar systems adapted from universities. Critical breakthroughs like SCORM, a standard that lets course content report results back to any compliant LMS, made it possible to track the learning of individual employees. But even in the early 21st century, the LMS was static, without adaptive curricula. The content was standardized for scale, not tailored to individuals’ unique needs and interests.
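For the technically curious, here is a minimal sketch, in TypeScript, of the kind of call a SCORM 1.2 course module makes against the runtime API an LMS exposes in the browser. The method and data-model names (LMSInitialize, cmi.core.lesson_status, and so on) come from the SCORM 1.2 specification; the findAPI helper and the specific values recorded here are illustrative assumptions, not a production implementation.

```typescript
// Sketch only: how a SCORM 1.2 course module ("SCO") reports one learner's
// progress to the LMS. Method and data-model names follow SCORM 1.2;
// findAPI and the sample values are illustrative assumptions.

interface Scorm12API {
  LMSInitialize(param: ""): string;                     // returns "true" | "false"
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;  // returns "true" | "false"
  LMSCommit(param: ""): string;
  LMSFinish(param: ""): string;
}

// The LMS exposes an API object on a parent frame; the course content
// walks up the window hierarchy to find it.
function findAPI(win: Window): Scorm12API | null {
  let current: Window | null = win;
  while (current) {
    const candidate = (current as unknown as { API?: Scorm12API }).API;
    if (candidate) return candidate;
    if (current === current.parent) break; // reached the top frame
    current = current.parent;
  }
  return null;
}

const api = findAPI(window);
if (api && api.LMSInitialize("") === "true") {
  // Record this individual learner's score and completion status.
  api.LMSSetValue("cmi.core.score.raw", "92");
  api.LMSSetValue("cmi.core.lesson_status", "completed");
  api.LMSCommit(""); // ask the LMS to persist the data
  api.LMSFinish(""); // end the tracking session
}
```

Calls like these are what finally let companies see, employee by employee, who completed which course, even though the content itself remained one-size-fits-all.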
The Future of L&D
The last 125 years make it clear how far corporate learning has come. These earlier versions of L&D popularized enterprise learning, but none of them solved the challenge of personalizing skill development and, in doing so, making it more impactful.
Today, we have the insight and technology to overcome these challenges, yet the shortcomings of traditional L&D strategies still hinder companies across industries. With the right direction, business leaders can adopt impactful L&D strategies that put employees first while maintaining the ability to scale. Our latest report highlights specific pain points for end-users, what they need from their managers and organizations, and how to align employee goals with overall business objectives.
For more insight on how to modernize your L&D strategy and optimize impact, download our report, How the Workforce Learns, below!