This post is the transcript of "Intro to Human Factors".
In the 40-minute video, I’m joined by a friend, colleague and Human Factors specialist, Peter Benda. Peter has 23 years of experience in applying Human Factors to large projects in all kinds of domains. In this session we look at some fundamentals: what does Human Factors engineering aim to achieve? Why do it? And what sort of tools and techniques are useful? As this is The Safety Artisan, we also discuss some real-world examples of how Human Factors can contribute to accidents or help to prevent them.
Transcript: Intro to Human Factors
Simon: Hello, everyone, and welcome to the Safety Artisan: Home of Safety Engineering Training. I’m Simon and I’m your host, as always. But today we are going to be joined by a guest, a Human Factors specialist, a colleague, and a friend of mine called Peter Benda. Now, Peter started as one of us, an ordinary engineer, but, unusually perhaps for an engineer, he decided he didn’t like engineering without people in it. He liked the social and human aspects, and so he began to specialize in that area. Today, after twenty-three years in the business, with a first degree and a master’s degree in engineering with a Human Factors speciality, he’s going to join us and share his expertise.
So that’s how you got into it then, Peter. For those of us who aren’t really familiar with Human Factors, how would you describe it to a beginner?
Peter: Well, I would say it’s the joint optimization of human-machine systems. So it’s really focusing on designing systems – holistically, perhaps, would be the term to use – where we’re looking at optimizing the human element as well as the machine element, and the interaction between the two. So that’s really the key to Human Factors. And, of course, there are many dimensions from there: environmental, organizational, job factors, human and individual characteristics. All of these influence behaviour at work and health and safety. Another way to think about it is the application of scientific information concerning humans to the design of systems – systems for human use, which I think most systems are.
Simon: Indeed. Otherwise, why would humans build them?
Peter: That’s right. Generally speaking, sure.
Simon: So, given that this is a thing that people do, then, perhaps we’re not so good at including the human unless we think about it specifically?
Peter: I think that’s fairly accurate. I would say that if you look across industries, the industries that are perhaps better at integrating Human Factors considerations into the design lifecycle have had to do so because of the accidents that have occurred in the past. You could probably say this about safety engineering as well, right?
Simon: And this is true, yes.
Peter: In a sense, you do it because you have to, because the implications of not doing it are quite significant. However, I would say the upshot, if you look at some of the evidence – and you see this also across software design and non-safety-critical industries or systems – is that taking human considerations into account early in the design process typically results in better system performance. You might have more usable systems, for example. Apple would be an example of a company that puts a lot of focus into human-computer interaction, optimizing the interface between humans and their technologies and ensuring that you can walk up and use them fairly easily. Now, as time goes on, one can argue about how well Apple is still doing that, but they were certainly very well known for taking that approach.
Simon: And reaped the benefits accordingly – I think they were the world’s number one company for a while.
Peter: That’s right. That’s right.
Simon: So, thinking about the “Why do it?” question: what are the benefits of doing Human Factors well?
Peter: Multiple benefits, I would say. Clearly safety, in safety-critical systems, and health and safety; performance, so system performance; efficiency and so forth. Also job satisfaction, and that has repercussions that go back into society, broadly speaking. Meaningful work has other repercussions, and that’s the angle I originally came into all of this from. But, you know, you could be looking at just the safety and efficiency aspects.
Simon: You mentioned meaningful work: is that what attracted you to it?
Peter: Absolutely. Absolutely. Yes. Like I said, I had a keen interest in the sociology of work and in work organization. For my master’s degree, I looked at lean production, which is the Toyota approach to producing vehicles. I looked at multiskilled teams, multiskilling and job satisfaction, and then at stress indicators and so forth, versus mass production systems. So that’s really the angle I came into this from. If you look at mass production lines, where a person is doing the same job over and over, the work is quite repetitive and very narrow, versus the more Japanese-style lean production. There are certainly repercussions, both socially and individually, from a psychological health perspective.
Simon: So, you get happy workers and more contented workers-
Peter: –And better quality, yeah.
Simon: And again, you mentioned Toyota. Another giant company that’s presumably grown partly through applying these principles.
Peter: Well, they’re famous for quality, aren’t they? Famous for reliable, high-quality cars that go on forever. I mean, when I moved from Canada to Australia, I found Toyota has a very, very strong history here with the Land Cruiser, the HiLux, and so forth.
Simon: All very well-known brands here. Household names.
Peter: They’re known to be bombproof, able to outlast any other vehicle. And the lean production system certainly bears, I would say, quite a bit of responsibility for the production of these high-quality cars.
Simon: So, we’ve spoken about how you got into it, and “What is it?” and “Why do it?” – well, what it is in very general terms. But I suspect a lot of people listening will want to understand what Human Factors is from how you do it. It’s a long, long time since I did my Human Factors training – just one module in my master’s – so could you take me through what Human Factors involves these days, in broad terms?
Peter: Sure, I actually have a few slides that might be useful –
Simon: – Oh terrific! –
Peter: – maybe I should present that. So, let me see how well I can share this. Maybe screen two is the best way to share it. Can you see that OK?
Simon: Yeah, that’s great.
Introduction to Human Factors
Peter: Intro to Human Factors. So, Stewart Dickinson, who I work with at Human Risk Solutions, and I prepared some material for some courses we taught to industry. I’ve some other material as well, and I’ll just flip to some of the key slides going through “What is Human Factors?”. So, let me try to get this working and I’ll flip through quickly.
Definitions of Human Factors
Peter: So, as I’ve mentioned already, broadly speaking: environmental, organizational, and job factors, and human and individual characteristics, which influence behaviour at work in a way which can affect health and safety. That’s a focus of Human Factors. Or: the application of scientific information concerning humans to the design of objects, systems and environments for human use. You see a pattern here – fitting the work to the worker. The term ergonomics is used interchangeably with Human Factors; it depends on the country you learned this in or apply it in.
Simon: Yes. In the U.K., I would be used to using the term ergonomics to describe something much narrower than Human Factors but in Australia, we seem to use the two terms as though they are the same.
Peter: It does vary. You can say physical ergonomics, and I think that’s typically what people picture when they think of ergonomics: workstation design. So, sitting at their desk, heights of tables or desks, reach, and so on. And particularly given the COVID situation, with so many people sitting at their desks, they’re probably getting some repetitive strain –
Simon: –As we are now in our COVID 19 [wo]man caves.
Peter: That’s right! So that’s certainly an aspect of Human Factors work because that’s looking at the interaction between the human and the desk/workstation system, so to speak, on a very physical level.
But of course, you have cognitive ergonomics as well, which looks at the perceptual and cognitive aspects of that work. So Human Factors, or ergonomics, broadly speaking, looks at these multi-dimensional facets of human interaction with systems.
Definitions of Human Factors (2)
Peter: Some other examples might be the application of knowledge of human capabilities and limitations to the design, operation and maintenance of technological systems, and I’ve got a little distilled – or summarized – bit on the right here: Human Factors applies scientific knowledge to the development and management of the interfaces between humans and rail systems. This one is obviously in the rail context, but broadly speaking you’re talking in terms of technological systems. That covers all of the people issues we need to consider to assure safe and effective systems or organizations.
Again, this is very broad, and engineers often don’t like these broad topics or broad approaches. I’m an engineer; I learned this through engineering, which is a bit different from how some people get into Human Factors.
Simon: Yeah, I’ve met a lot of Human Factors specialists who come to it from a first degree in psychology.
Peter: That’s right. I’d say that’s fairly common, particularly in Australia and the UK. Although you can take it here in Australia in some of the engineering schools, it’s fairly rare. There’s an aviation Human Factors program, I think, at Swinburne University; they used to teach it through mechanical engineering there as well, and I did a bit of teaching into that. I’m not across all of the universities in Australia, but there are a few. I think the University of the Sunshine Coast has quite a significant group at the moment that came from, or had some connection to, Monash before that. When I’m doing this work, I think about: what existing evidence do we have? What existing knowledge base, with respect to the human interactions with the system? For example, working with a rail transport operator, they will already have a history of incidents or issues, and we’d be looking perhaps to improve performance or reduce the risk associated with the use of certain systems. So we really focus on the evidence that exists, either already in the organization or out there in the public domain, through research papers and studies and accident analyses and so forth. I think, much like safety engineering, there are some or quite a few similarities in terms of the evidence base –
Simon: – Indeed.
Peter: – Or creating that evidence through analysis. So, using some analytical techniques, various Human Factors methods and that’s where Human Factors sort of comes into its own. It’s a suite of methods that are very different from what you would find in other disciplines.
Simon: Sure, sure. So, can you give us an overview of these methods, Peter?
Peter: I’m trying to think whether I have a slide for this. Hopefully, I do.
Simon: Oh, sorry. Have I taken you out of sequence?
Peter: No, no. Not out of sequence. Let me just flip through, and take a look at –
The Long Arm of Human Factors
Peter: This is probably a good overview of the span of Human Factors, and then we can talk about the sorts of methods that are used for each of these – let’s call them – dimensions. So, we have what’s called the long arm of Human Factors. It’s a large range of activities, from the very physical ergonomics we were talking about – e.g. sitting at a desk and so on – through manual handling and workplace design, and moving to interface design with respect to human-machine interfaces (HMIs, as they’re called) or user interfaces. There are manual handling analysis techniques – you might be using something like a task analysis combined with the NIOSH lifting equation and so on. For workplace design, you’d be looking at anthropometric data. So, you would have a dataset that’s hopefully representative of the population you’re designing for, and you may have quite specific populations. Human Factors engineering is fairly extensively used, I would say, in military projects – in the military context –
Simon: – Yes.
Peter: – And there’s a set of standards, MIL-STD-1472G, for example, from the United States. It’s a great example that gives not only manual handling standards or guidelines but also workplace design guidelines – and the workplace, in a military sense, can be in a vehicle, on a ship, on a base, and so forth.
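As an aside for readers: the revised NIOSH lifting equation that Peter mentions reduces a lifting task to a Recommended Weight Limit (RWL) by multiplying a load constant of 23 kg by a set of task-geometry multipliers. Here is a minimal Python sketch of the metric form; note that the frequency and coupling multipliers are passed in as given values rather than looked up from the NIOSH tables, and the example task geometry is purely illustrative.

```python
def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm=1.0, cm=1.0):
    """Recommended Weight Limit (kg) per the revised NIOSH lifting
    equation (metric form). fm (frequency) and cm (coupling) are
    taken as inputs; in practice they come from the NIOSH tables."""
    LC = 23.0                                      # load constant, kg
    hm = min(1.0, 25.0 / h_cm)                     # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)            # vertical multiplier
    dm = 1.0 if d_cm <= 25 else 0.82 + 4.5 / d_cm  # distance multiplier
    am = 1.0 - 0.0032 * a_deg                      # asymmetry multiplier
    return LC * hm * vm * dm * am * fm * cm

def lifting_index(load_kg, rwl_kg):
    """Lifting Index: values above 1.0 indicate increased injury risk."""
    return load_kg / rwl_kg

# Illustrative example: a 12 kg lift, hands 30 cm out, starting 60 cm
# above the floor, moved 40 cm vertically, with no torso twist.
rwl = niosh_rwl(h_cm=30, v_cm=60, d_cm=40, a_deg=0)
li = lifting_index(12, rwl)
```

This is the kind of calculation that would be combined with a task analysis, as Peter describes, to assess each lifting step in a job.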
Interface design – if you’re looking at it from a methods perspective, you might have usability evaluations, for example. You might do workload studies and so forth, looking at how well the interface supports particular tasks or achieving certain goals.
Human error – there are human error methods that typically leverage off task models. So, you’d have a task model, and for that particular task you would look at what sorts of errors could occur; there are structured methods for that.
Simon: Yes, I remember human task analysis – seeing colleagues use that on a project I was working on. It seemed quite powerful for capturing these things.
Peter: It is, and you have to choose the level of analysis pragmatically, because you could go down to a very granular level of detail. But that may not be useful, depending on the sort of system design you’re doing, the amount of money you have, and how critical the task is. You might have a significantly safety-critical task, and that might need quite a detailed analysis. An example there – you can look up the accident analysis online – is, I believe, the Virgin Galactic test flight. This was one of those test flights in the U.S. – I have it somewhere in my archive of accident analyses – where the FAA had approved the test flights to go ahead, and there was a task where – I hope I don’t get this completely wrong – the two pilots (a pilot and a co-pilot) had to take this near-space vehicle to high altitude. They were moving at quite a high speed, and there was a particular task where, I think, they had to slow the aeroplane by reducing the throttle, and then at a certain point, a certain speed, they could deploy, or control, the ailerons or some such wing-based device, and the task order was very important. What happened was that the pilot or the co-pilot performed the task slightly out of order. Doing one thing before the other led to the plane breaking up. Fortunately, one of the pilots survived; unfortunately, one didn’t.
Simon: So, very severe results from making a relatively small mistake.
Peter: So that’s a task-order error, which is very easy to make. If the system had been designed to prevent that action being executed at that point, that would have been a safer design. At that level, you might be going down to what gets called keystroke-level analysis and so on.
Simon: – Where it’s justified, yes.
Peter: Task analysis is, I think, probably one of the most common tools used. You also have workload analysis – looking at, for example, interface design. I know on some of the projects we were working on together, Simon, workload was a consideration. There are different ways to measure workload. There’s the NASA TLX, which is a subjective workload questionnaire, essentially, done post-task, but it’s been shown to be quite reliable and valid as well. So, that instrument is used, and there are a few others; it depends on the sort of study you’re doing, the amount of time you have, and so forth. So, that’s workload analysis.
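For readers curious how the NASA TLX turns a questionnaire into a number: participants rate six subscales (mental demand, physical demand, temporal demand, performance, effort, frustration) on 0–100 scales, and the overall score is either a simple mean (“raw TLX”) or a mean weighted by 15 pairwise comparisons between the subscales. A minimal sketch, with illustrative ratings and weights invented for the example:

```python
SUBSCALES = ("mental", "physical", "temporal",
             "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Raw (unweighted) NASA TLX: mean of the six 0-100 subscale ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def weighted_tlx(ratings, weights):
    """Weighted NASA TLX: each subscale's weight is the number of times it
    was chosen in the 15 pairwise comparisons, so the weights sum to 15."""
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Illustrative post-task ratings and pairwise-comparison weights
ratings = {"mental": 70, "physical": 20, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 50}
weights = {"mental": 5, "physical": 0, "temporal": 4,
           "performance": 2, "effort": 3, "frustration": 1}

overall = weighted_tlx(ratings, weights)  # weighting emphasises mental demand
```

The weighting step is what lets the score reflect which demands mattered most for the particular task being assessed.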
Safety culture – I wouldn’t say that’s my forte. I’ve done a bit of work on safety culture, but that’s more organizational, and the methods there tend to be more around culpability models and implementing those into the organizational culture.
Simon: So, more governance type issues? That type of thing?
Peter: Yes. Governance and – whoops! Sorry, I didn’t mean to do that. I’m just looking at the systems and procedure design. The ‘e’ is white so it looks like it’s a misspelling there. So it’s annoying me …
Simon: – No problem!
Peter: Yes. So, there are models I’ve worked with at organizations, such as some rail organizations, where they look at governance, but also at appropriate interventions. If there’s an incident, what sort of intervention is appropriate? Essentially, you use a model of culpability and human error and then overlay that, or use it as a lens through which to analyse the incident. Then you appropriately either train employees or management and so on. Or perhaps it was a form of violation – a wilful violation, as it may be –
Simon: – Of procedure?
Peter: Yeah, of procedure and so on versus a human error that was encouraged by the system’s design. So, you shouldn’t be punishing, let’s say, a train driver for a SPAD if the –
Simon: – Sorry, that’s a Signal Passed At Danger, isn’t it?
Peter: That’s right. Signal Passed At Danger. So, it’s certainly possible that the way the signalling is set up leads to a higher chance of human error. You might have multiple signals at a location and it’s confusing to figure out which one to attend to and you may misread and then you end up SPADing and so on. So, there are, for example, clusters of SPADs that will be analysed and then the appropriate analysis will be done. And you wouldn’t want to be punishing drivers if it seemed to be a systems design issue.
Simon: Yes. I saw a vivid illustration of that on the news, I think, last night. There was a news article about an air crash that tragically killed three people a few months ago here in South Australia. The news report was saying it was human error, but when they actually got into reporting what had happened, it was pointed out that the pilot being tested – it was a twin-engine aeroplane and they were doing an engine-failure-after-take-off drill – and the accident report said that the procedure they were using allowed them to do that engine failure drill at too low an altitude. So, if the pilot failed to take the correct action very quickly – bearing in mind this was a pilot being tested, because they were undergoing training – there was no time to recover, and therefore the aircraft crashed. So, I thought, “Well, it’s a little bit unfair just to say it’s human error when they were doing something that was intrinsically inappropriate for a person of that skill level.”
Peter: That’s an excellent example, and you hear this in the news a lot: human error, human error, human error. You saw this, I think, with the recent Boeing problems with the flight control system for the new 737s. And of course, there will be reports. Some of the interim reports already talk about some of the Human Factors issues inherent in that, and I would encourage people to look up the publicly available documentation on that –
Simon: – This is the Boeing 737 Max accidents in Indonesia and in Ethiopia, I think.
Peter: That’s correct. That’s correct. Yes, absolutely. Pilot error was used as the general explanation, but under further analysis you start looking at that error, and that “error”, so to speak, perhaps has other causes – systems design causes, perhaps. These things are being investigated but have been written about quite extensively. And you can look at, of course, any number of aeroplane accidents and so on. There’s a famous Air France one, flying from Brazil to Paris – it might have been Rio de Janeiro to Paris – where the pitot –
Simon: – Yeah, pitot probes got iced up.
Peter: Probes, they iced up, and it was dark, so the pilots didn’t have any ability to gauge things by looking outside. I believe it was dark, or it might have been a storm; there was some difficulty in gauging what was going on outside of the aeroplane, and there again, misreads. So, stall alarms going off and so on, I believe, and there were some misreadings on the airspeed coming from the sensors, essentially. The pilots acted according to that information, but the information was incorrect. So, you could say there was probably a cascade of issues that occurred there, and there’s a fairly good analysis one can look up that looks at the design – I believe it was an Airbus, and it was the design of the Airbus. So, you had one pilot providing an input in one direction on the controls and the other pilot in the other direction. A number of things broke down. And typically, you’ll see this in accidents: you’ll have a cascade as the crew try to troubleshoot; they can’t figure out what’s going on, so they start applying various approaches to try to remedy the situation, and people begin to panic, and so on.
And you have training techniques, like crew resource management, which certainly has a strong Human Factors element – it comes out of the Human Factors world – and which looks at how to have teams in cockpits, and in other situations, work effectively in emergencies. And that came, of course, after analysing failures.
Simon: Yes, and I think CRM, crew resource management, has been adopted not just in the airline industry, but in many other places as well, hasn’t it?
Peter: Operating theatres, for example. There was quite a bit of work in the 90s that started with, I think, David Gaba, who was at Stanford – this is all from memory – and that then looked at operating theatres. In fact, the Monash Medical Centre in Clayton had a simulation centre for operating theatres, where they were applying these techniques to training operating theatre personnel: surgeons, anaesthetists, nurses and so forth.
Simon: Well, thanks, Peter. I’m sorry – I think I hijacked your presentation, but –
Peter: It wasn’t really a presentation anyway; it was more a sort of guide. We were talking about methods, weren’t we? And it’s easy to go from methods to talking about accidents, because then we talk about the application of some of these methods, or how these methods are applied to prevent accidents from occurring.
Simon: Cool. Well, thanks very much, Peter. I think maybe I’ll let the next time we have a chat I’ll let you talk through your slides and we’ll have a more in-depth look across the whole breadth of Human Factors.
Peter: So that’s probably a good little intro at the moment anyway. Perhaps I might pull up one slide on Human Factors integration before we end.
Simon: Of course.
Peter: I’ll go back a few slides here.
What is Human Factors Integration?
Peter: And so, what is Human Factors integration? I was thinking about this quite a bit recently because I’m working on some complex projects – not only complex but quite large engineering projects, with lots of people, lots of different groups involved, different contracts and so forth – and the integration issues that occur. They’re not only Human Factors integration issues; there are larger-scale integration issues, engineering integration issues. Generally speaking, this is something I think projects often struggle with. And I was really thinking about the Human Factors angle, and Human Factors integration. That’s about ensuring that all of the HF issues – HF being Human Factors – in a project are considered and controlled throughout the project, and deliver the desired performance and safety improvements. So, three functions of Human Factors integration:
- confirm the intended system performance objectives and criteria
- guide and manage the Human Factors aspects of the design cycle so that negative aspects don’t arise and prevent the system reaching its optimum performance level
- and identify and evaluate any additional Human Factors safety aspects to be included in the safety case.
You’ll find, particularly in these complex projects, issues at the interfaces. You might have quite a large project with sub-projects working on particular components. Let’s say one is working on the civil/structural elements, and maybe space provisioning and so on, while another is working more on control systems. The integration between those becomes quite difficult, because you don’t really have that Human Factors integration function working to integrate those two large components. Typically, it sits within those focused project groupings – that’s one way to describe them. Does that make sense?
Simon: Yeah. Yeah, absolutely.
Peter: I think that’s one of the big challenges I’m seeing at the moment: you have a certain amount of time and money and resource – this would be common to other engineering disciplines – and the integration work often falls by the wayside, I think. And that’s where I think a number of the ongoing Human Factors issues are going to be cropping up in some of these large-scale projects for the next 10 to 20 years, both operationally and perhaps in safety as well. Of course, we want to avoid –
Simon: – Yes. I mean, what you’re describing sounds very familiar to me as a safety engineer, and I suspect a lot of engineers of all disciplines who work on large projects are going to recognize it as a familiar problem.
Peter: Sure. Think about it: you’ve got the civil and space-provisioning aspect of a project, and another group is deciding what goes into, let’s say, a control room or a maintenance room and so on. It may be that things are constrained in such a way that the racks in the room have to be designed in a way that makes the work more difficult for maintainers. And it’s hard to optimize these things, because these are complex projects with complex considerations, and a lot of people are involved in them. The nature of engineering work is typically to break things down into little elements, optimize those elements, and bring them all together.
Peter: Human Factors tends to – well, you can do that in Human Factors as well, but I would argue, and certainly what attracted me to it, is that you tend to have to take a more holistic approach to human behaviour and performance in a system.
Peter: Which is hard.
Simon: Yes, but rewarding. And on that note, thanks very much, Peter. That’s been terrific. Very helpful. And I look forward to our next chat.
Peter: For sure. Me too. Okay, thanks!
Simon: Well, that was our first chat with Peter on the Safety Artisan and I’m looking forward to many more. So, it just remains for me to say thanks very much for watching and supporting the work of what we’re doing and what we’re trying to achieve. I look forward to seeing you all next time. Okay, goodbye.