Human Aspects of Unmanned Aerial Systems


Unmanned Aerial Systems

Unmanned Aerial Systems (UAS) is the term for the collective components that operate an unmanned aerial vehicle (UAV), including ground stations and other integral elements. UAVs are airborne vehicles that fly and perform other functions without a pilot on board. They use electromagnetic waves for remote control from a different geographical location. A UAV is either flown by a pilot at a ground control station or flies autonomously according to pre-programmed flight procedures; modern vehicles fly using highly complex and dynamic automation technology. Today, the military uses UAVs for reconnaissance, providing surveillance and intelligence over areas of interest, and some UAVs are armed with missiles that can be deployed remotely (The UAV - Unmanned Aerial Vehicle). UAVs gained popularity during the First World War as surveillance aids for soldiers, but serious research into them began only after their successful use by the Israelis in the war against Syria. Interestingly, early UAVs were called drones because they had little or no autonomy: a controller on the ground commanded all of the vehicle's functions. Since then, UAV technology has advanced rapidly in the number of functions performed, stealth of operation, and level of autonomy. Despite these advances, however, the vehicles still rely heavily on human involvement. This paper therefore reviews the literature on the human aspects of UAS and their implications for existing remotely operated vehicles and for the future of such vehicles.

Situational Awareness

One of the human aspects involved in UAS is situational awareness. This human factor is studied to understand human-automation errors and to formulate solutions to them. Situational awareness often comes up in air traffic control (ATC), where controllers frequently cite failure to maintain situational awareness as a cause of accidents. Endsley (97) defines situation awareness as a three-level process. The first level is awareness of the major elements in a situation, also called cue perception. The second is the gestalt understanding and integration of that set of data in light of the operational objectives. The third is the capacity to project future states of the relevant system. The second and third levels are vital in complex systems. It is also critical to understand the time limits of events, so that one can react within a time frame that preserves the future operation of the system. Situational information changes constantly, so an operator's situational awareness needs constant updates for accurate future predictions. The operator makes decisions and acts according to the interpreted situation; situation awareness is therefore the major precursor to decision-making. Automation can thus be designed to support sufficient situational awareness through system interfaces and decision aids. Conversely, situation awareness can be curtailed if system designers fail to address the operator's situational-awareness requirements sufficiently. According to Endsley (99), several factors determine the completeness and accuracy of situation awareness. The first are attention and working memory, both of which are limited in human beings. Directing attention to information and assigning priorities among several bundles of information depend entirely on the operator.
Operators sometimes overlook critical information even when all of it is immediately available (Jones and Endsley 510). Working memory, on the other hand, is limited in that it can understand and integrate only about eight entities or objects at a time. It is therefore not easy for an operator to combine, interpret, and make projections from many objects of information; only through training and sufficient experience can an operator improve these working-memory tasks. Expectation is another factor that affects attention and how perceived cues are interpreted and integrated. The implications of situation awareness for UAVs are that further research is needed into the workload and attention demands of situation awareness, in order to assess interfaces and system design, and into suitable training techniques for situation awareness such as system understanding, checklists, and workflow patterns (Endsley 101).
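Endsley's three levels can be illustrated with a toy example. This is only a sketch: the telemetry fields, function names, and numbers are hypothetical, not drawn from any real UAS.

```python
# Toy illustration of Endsley's three levels of situation awareness,
# using hypothetical fuel telemetry. Level 1: perceive the raw cues;
# Level 2: integrate them against the operational goal; Level 3:
# project the future state of the system.

def perceive(telemetry):
    """Level 1: extract the relevant cues from raw telemetry."""
    return telemetry["fuel_kg"], telemetry["burn_rate_kg_per_min"]

def comprehend(fuel_kg, burn_rate, reserve_kg=20.0):
    """Level 2: integrate the cues into a meaningful quantity -
    minutes of usable flight time above the reserve margin."""
    return max(0.0, (fuel_kg - reserve_kg) / burn_rate)

def project(endurance_min, distance_to_base_km, speed_km_per_min):
    """Level 3: project whether the vehicle can reach base
    before the usable fuel is exhausted."""
    return endurance_min * speed_km_per_min >= distance_to_base_km

telemetry = {"fuel_kg": 80.0, "burn_rate_kg_per_min": 1.5}
fuel, burn = perceive(telemetry)
endurance = comprehend(fuel, burn)          # (80 - 20) / 1.5 = 40 min
can_return = project(endurance, 90.0, 2.5)  # 40 * 2.5 = 100 km >= 90 km
```

The point of the sketch is that the higher levels are derived, not sensed: a display that shows only `fuel_kg` leaves levels 2 and 3 as mental arithmetic for the operator, which is exactly the workload a well-designed interface can absorb.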

Teamwork

Teamwork is the second human aspect involved in UAVs. A team consists of two or more persons, each with specific functions, collaborating to attain a common goal. Teamwork is then the incorporation of knowledge, specific skills, and attitudes that lets the members of a team optimize and adapt their performance. The dependencies between tasks in a system influence how members of the team interact with each other; when those dependencies are well managed, the ability to keep up with varying system loads greatly improves. The major task is to make the components of an automation into operational "team members". Although the earlier expectation that workload decreases with automation seemed plausible, practical results show that an automated team member must meet many challenges (Klein, Woods and Bradshaw 92). First, the team member must be able to tell when it cannot complete a task for the team. Second, it must be able to comprehend the actions and intentions of the other members. Third, it must be capable of predicting future events during task execution. Fourth, it must be able to adapt to dynamic situations. Fifth, it must be able to signal not only discrete trouble conditions (e.g., overload, redline) but also pending continuous troubles. Finally, the automated team member must be able to weigh costs while performing tasks and making decisions. The implications of teamwork for UAV operations include the question of how much physical separation among UAV pilots in a UAV Control Station (UCS) affects coordination, and hence what possibilities for teamwork exist in the UCS. A second implication is determining the types of displays and processes that ensure operator shift handoffs do not degrade performance with respect to mission objectives and safety.
The final implication concerns multiple operators per vehicle and the switches between manual and automated flight, and how these operations affect teamwork.
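The challenges listed above can be read as an interface that an automated team member would have to expose to its human teammates. The following is a minimal sketch under that reading; every class, attribute, and threshold is hypothetical.

```python
# Hypothetical interface for an automated "team member", mirroring
# the challenges above: declare inability, expose intent, and signal
# both discrete trouble conditions and the continuous trend.

from dataclasses import dataclass, field

@dataclass
class AutomatedTeamMember:
    capabilities: set = field(default_factory=set)
    load: float = 0.0          # 0.0 (idle) .. 1.0 (saturated)
    intent: str = "idle"       # current action, visible to teammates

    def can_complete(self, task: str) -> bool:
        # Challenge 1: say so when a task is beyond its ability.
        return task in self.capabilities and self.load < 0.9

    def report_intent(self) -> str:
        # Challenge 2: make actions and intentions inspectable.
        return self.intent

    def trouble_signals(self) -> dict:
        # Challenge 5: the discrete condition plus the continuous trend.
        return {"overload": self.load >= 0.9, "load_trend": self.load}

member = AutomatedTeamMember(capabilities={"hold_altitude", "loiter"})
member.load = 0.4
assert member.can_complete("loiter")
assert not member.can_complete("fire_weapon")  # outside its abilities
```

The design choice worth noting is that the member reports a trend (`load_trend`), not just a binary alarm, so a human teammate can anticipate saturation before it happens.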

System Monitoring

System monitoring is the third human aspect involved in UAS. In modern systems such as airliners, many classical tasks have been automated, leaving the pilot mainly with monitoring and supervision of automated processes. Compared with manned airplanes, an operator's station at the UCS likely contains more instrumentation: screens for camera feeds, displays showing the communication fidelity between the UAV's controls and the operator's controls, and displays showing the UAV's flight plan. Because of the increased monitoring tasks, the operator's cognitive capacity can be overwhelmed when there is too much information. In constructing a UAS, engineers must therefore decide carefully what information is displayed at the UCS, how it is displayed, and what is excluded. Since human concentration declines with time, personnel management must incorporate regularly changing shifts at the UCS (Wickens 57). Furthermore, according to Wickens (92), automation is directly proportional to the need for monitoring: the more automation, the more monitoring tasks. In addition, since an automated system can malfunction, monitoring can add to the operator's workload. Supervisory control is a process in which the operator monitors an automated process and continuously determines whether to initiate control, especially in situations such as a malfunction (Moray 42). Automated systems prompt the operator into a supervisory role to make sure commands are correctly comprehended and executed. An operator must be fully trained on the automated system to monitor it, detect errors, and make appropriate decisions. Modern systems are so heavily automated, with complex architectures and numerous modes of operation and faults, that the supervisor's work becomes harder; this must be considered during design.
Moreover, computers react faster than humans, so an operator cannot always check information in real time (Bainbridge 276). A carefully categorized alarm system can greatly help the monitoring process, and engineers should therefore design alarms designated for different deviations and faults. Future implications of system monitoring for UAVs include determining the merits and demerits of monitoring across different UAV interfaces, and developing better warnings and alarms that get operators' attention and direct them to identify and correct the problem. These alarms must also be communicated in a way that prevents misinterpretation.
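One way to read the alarm recommendation is that each class of deviation carries its own severity, so the UCS can order operator attention deterministically. A hypothetical sketch (the fault names, severity tiers, and mapping are illustrative, not from any real UCS):

```python
# Hypothetical categorized alarm scheme: each fault class has a fixed
# severity, and active alarms are surfaced most-urgent first.

from enum import IntEnum

class Severity(IntEnum):
    ADVISORY = 1   # log only; no immediate operator action required
    CAUTION = 2    # operator attention at the next opportunity
    WARNING = 3    # immediate operator action required

ALARM_CATEGORIES = {
    "link_latency_high": Severity.CAUTION,
    "link_lost":         Severity.WARNING,
    "fuel_low":          Severity.CAUTION,
    "engine_fault":      Severity.WARNING,
    "camera_degraded":   Severity.ADVISORY,
}

def prioritize(active_alarms):
    """Return the active alarms ordered most-urgent first."""
    return sorted(active_alarms,
                  key=lambda a: ALARM_CATEGORIES[a], reverse=True)

queue = prioritize(["camera_degraded", "link_lost", "fuel_low"])
# the WARNING surfaces before the CAUTION and the ADVISORY
```

Keeping the category table in one place also supports the misinterpretation concern: each severity tier can be bound to a distinct, consistent presentation (tone, color, message format) rather than deciding presentation ad hoc per fault.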

Decision Making

Decision-making is the fourth human aspect involved in UAS. Making a decision when fully aware of the outcome is easy; making one with no insight into the probable outcome is another matter entirely. Unfortunately, uncertain outcomes are the nature of the real world. The demands on an operator's cognitive abilities have vastly increased in the current era of UAS technology. An operator needs to be constantly aware of a system's procedures and familiar with the system's decision-making and planning processes (Bell 963). With current technological advances, automation has developed to the point where it makes most decisions during operation; UAVs are an example, with the vehicles assuming the larger share of decisions in flight. An ethical dilemma then arises over what level of authority should be bestowed on these machines. Problems occur when the vehicle's computer programs do not define a task or a decision, and UAVs cannot assimilate certain sensory data or other inputs, so many decision-making tasks are still left to humans. However, issues such as memory limitations, stress, and dynamic surroundings easily compromise human cognition and decisions. Humans are not known for rational decision-making; decisions are often influenced by emotions, which can introduce biases that lead to sub-optimal strategies. Therefore, if the likely biases in decision-making can be mapped, training institutions can train operators to overcome them, and decision aids can be provided to alert operators to those biases (G. A. Klein 144). As Endsley (101) observes, a decision can be correct given the decision maker's perception of the situation while the perception itself is in error. Poor understanding of the condition at hand leads to poorly made decisions.
UAV automation must be further fine-tuned in its design so that the vehicle's operation matches the operator's comprehension of the underlying processes. UAV engineers should therefore conduct more research into how decisions are reached when a UAV system is involved.
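The cited work on regret in decision-making under uncertainty can be made concrete with a toy payoff table. The actions, states, and numbers below are entirely hypothetical; the point is only to show how regret is computed and how it can rank choices differently from raw payoffs.

```python
# Toy minimax-regret choice over a hypothetical payoff table.
# Rows are actions, columns are uncertain states of the world.
# Regret = best achievable payoff in a state minus the payoff obtained.

payoffs = {
    "continue_mission": {"calm": 10, "storm": -8},
    "return_to_base":   {"calm": 2,  "storm": 2},
}

states = ["calm", "storm"]

# Best payoff achievable in each state, over all actions.
best_in_state = {s: max(p[s] for p in payoffs.values()) for s in states}

# Worst-case regret for each action across the states.
regret = {a: max(best_in_state[s] - p[s] for s in states)
          for a, p in payoffs.items()}

# continue_mission: regrets (0, 10) -> worst case 10
# return_to_base:   regrets (8, 0)  -> worst case 8
choice = min(regret, key=regret.get)   # minimax regret: "return_to_base"
```

Note that `continue_mission` has the higher best-case payoff, yet minimax regret picks the safer action; this is the kind of bias-aware framing a decision aid could surface to an operator.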

Trust in Automation

The fifth human aspect concerning UAS is trust in automation. Automation has enabled machines to take over strategic and more advanced functions. For example, the flight management system (FMS) in an airplane can program an entire flight plan; machines are now effectively part of the team, and poor teamwork between an automation and a human can be expensive and dangerous. Teamwork involves trust among team members, and the question arises whether a machine process can be fully trusted. A pilot, for example, will have less trust in a machine prone to false alarms and warnings, but with more refined FMS automation algorithms, the pilot can trust it to carry out critical functions. As systems advance, the operator has the freedom to choose whether to commit tasks to an automated system, and that decision rests on the benefits the operator perceives the system to offer. Since those benefits only materialize once the system has been chosen, the issue of trust arises: the operator must first develop a level of trust in the system (Corritore, Kracher and Wiedenbeck 634). Levels of trust must then be carefully calibrated, because too little trust in automation leads to underutilization of the system's automated resources, while too much trust can lead to complacency on the operator's part and blind devotion to the automated system. Misuse and disuse scenarios must be avoided: misuse is when humans depend on automation inappropriately, while disuse is when humans reject the abilities of automation. Disuse is attributed to a lack of understanding of the automation and to past automation failures. Studies of UAV operations show that operators distrust systems that err on easy tasks more than systems that err on harder tasks (Corritore, Kracher and Wiedenbeck 635).
Misuse of an automated system stems from complacency: an operator's trust in the automation increases with time of use while vigilance decreases. Faults that occur after long periods of trusting the system are likely to go undetected. During abnormal conditions it can therefore be almost impossible for the operator to assume manual control, because excessive trust has rendered the automation opaque. According to Cassell (51), the operator's work environment greatly affects trust in automation: the format and content of instrument interfaces in a UAS shape the operator's trust in the system. A system with user-friendly interfaces and sufficient data presented through its instruments tends to boost trust, and a system that provides instant feedback, presents information trends, and uses prompts to verify information is deemed trustworthy. In conclusion, even with competent automation, UAS should be designed and calibrated carefully to avoid both complacent use and disuse.
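The calibration problem described above can be sketched as a running trust score with flags for the two failure modes. The update rule, thresholds, and flag names are hypothetical; the sketch only shows how a long run of flawless automation drifts trust toward the complacency zone.

```python
# Hypothetical trust calibration: an exponentially weighted success
# score, with flags for the two failure modes discussed above -
# complacency (over-trust) and disuse (under-trust).

def update_trust(trust, outcome_ok, alpha=0.1):
    """Move trust toward 1.0 on success, toward 0.0 on failure."""
    target = 1.0 if outcome_ok else 0.0
    return (1 - alpha) * trust + alpha * target

def calibration_flag(trust, lo=0.3, hi=0.95):
    if trust > hi:
        return "complacency_risk"   # operator may stop monitoring
    if trust < lo:
        return "disuse_risk"        # operator may reject automation
    return "calibrated"

trust = 0.5
for _ in range(60):                 # a long run of flawless automation
    trust = update_trust(trust, True)
# trust now sits near 1.0, i.e. in the complacency zone
```

A single failure after such a run barely dents the score (one update moves it only 10% toward zero), which mirrors the observation that faults arriving after long periods of trust are likely to go undetected.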

Ergonomic and Environmental Aspects

Ergonomic and environmental aspects form the sixth group of human factors affecting UAS. Telepresence is the first ergonomic aspect: a UAV pilot must control the vehicle across vast distances using only the vehicle's vantage point. This differs from manned flight, since UAV pilots must react to the situation and the surroundings differently, and the pilot's situational awareness changes when he or she is on the ground. UAVs offer two vantage points: an exocentric perspective (the "chase plane view") and an egocentric one (the "out the windscreen view"). For an operator, switching between the views overloads monitoring stamina in scanning and mentally integrating them (Wickens 210); seamless integration of the egocentric and exocentric views into one common dynamic viewpoint can greatly reduce that workload. The second environmental aspect is risk-taking behavior and virtual reality. This aspect introduces the subject of simulators, because the UAV control station closely resembles the simulation interfaces used in aviation training institutions. Unlike in a simulator, where a whole exercise can be restored, a UAV operator cannot afford mistakes; putting an operator in a situation resembling a computer game can create a sense of immortality, encouraging risky decisions and degraded concentration that can lead to catastrophic outcomes (Gawron 435). Communication is the third environmental aspect. The communication loops involved run between the UCS and the UAV and between the UCS and air traffic control (ATC). A UAV operator must constantly understand the situation aboard the UAV in order to act appropriately, and sufficient UAV-UCS and UCS-ATC communication is paramount to an efficient and safe environment (Hilburn, Parasuraman and Singh 161).
Furthermore, information passing through the various data links must be free of delays to avoid poor system performance and risky situations, and control inputs and instruction sets must work seamlessly, especially during emergency protocols. Lack of sensory cues is the fourth ergonomic aspect affecting UAS. UAV pilots do not receive the sensory inputs that pilots of manned aircraft do. A pilot can, for example, experience turbulence at first hand and make a decision based on the sensory cues a plane provides when it hits turbulence; a UAV operator, however, depends completely on the measurement gauges at the UCS. The gauges usually suffice, but sensory cues would be an added advantage in decision-making, since a sensory feeling reaffirms or helps solidify a decision (Calhoun, Draper and Ruff 2145). Sensory isolation is unfavorable because the operator depends solely on imagery from camera screens and other instrumentation. Adding more cameras to the UAV can provide an all-around view of its surroundings and a better visual experience. A form of augmented reality can also simulate the missing sensory cues, for example through haptic feedback: when the UAV experiences turbulence, the operator's seat vibrates to simulate it. More augmented feedback can greatly improve the operator's workload and situational awareness.
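The haptic-feedback example can be sketched as a mapping from a sensed turbulence magnitude to a bounded seat-vibration intensity. The scale, dead band, and units below are hypothetical, chosen only to illustrate the idea of restoring a sensory cue.

```python
# Hypothetical haptic cue: map a turbulence magnitude reported by the
# UAV's sensors onto a bounded seat-vibration intensity, restoring a
# sensory cue that manned-aircraft pilots receive for free.

def vibration_intensity(turbulence_g, dead_band=0.05, full_scale=1.0):
    """Return seat vibration in [0, 1]. Small jitter below the dead
    band is suppressed so the cue does not become a nuisance alarm."""
    if turbulence_g < dead_band:
        return 0.0
    return min(1.0, turbulence_g / full_scale)

assert vibration_intensity(0.02) == 0.0   # below dead band: no cue
assert vibration_intensity(0.5) == 0.5    # proportional in mid-range
assert vibration_intensity(2.0) == 1.0    # clipped at full scale
```

The dead band and the clip are the two design choices that matter: without them, the cue would either vibrate constantly on sensor noise or injure the operator on an extreme reading, defeating its purpose as an unobtrusive reaffirming signal.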

Reversion to Manual Control

The final human aspect is reversion to manual control. According to Wiener (726), extensive use of automated control degrades an operator's ability to reassume manual control. More importantly, abrupt resumption of manual control is the necessary course of action during emergencies, where operator underperformance can prove costly (Hart 26). Lack of time and inadequate performance feedback from the system prevent the operator from regaining familiarity with the system's dynamics quickly enough. An operator must be competent enough to apply cognitive skills to identify, analyze, and predict a system's possible outcomes, and then to carry out accurate corrective measures. It therefore becomes necessary for UAS institutions to retrain operators regularly in manual-mode operations, for example through scenario-based refresher training, so that an operator always has adequate skills to shift between stages of automated flight control when required.

Conclusion

In conclusion, it is clear that despite extensive automation in UAS technology, humans still hold an integral position in the operation of UAVs and, obviously, in the design and engineering of UAS. It is therefore necessary to investigate the human-factor aspects of UAS to facilitate efficiency, reduce costs, and improve safety. The human aspects highlighted in this review include the operator's situational awareness, the teamwork implications of human and automated systems, system monitoring by the operator, and the operator's decision-making. Further aspects included the operator's trust in automation, the environmental factors affecting UAVs and operators, and finally the issues affecting reversion to manual control by the operator.

Works Cited

Bainbridge, L. "Ironies of automation." Rasmussen, J., K. Duncan and J. Leplat. New Technology and Human Error. New York: Wiley, 1987. 271-283.

Bell, D. "Regret in decision making under uncertainty." Operations Research 30 (1982): 961-981.

Calhoun, G. L., et al. "Utility of a tactile display for cueing faults." Proceedings of the Human Factors and Ergonomics Society 46th Annual Meeting 2002: 2144-2148.

Cassell, J., and T. Bickmore. "External Manifestations of Trustworthiness in the Interface." Communications of the ACM 43.12 (2000): 50-56.

Corritore, C., B. Kracher and S. Wiedenbeck. "An Editorial on Trust and Technology." International Journal of Human-Computer Studies 58.5 (2003): 633-635.

Endsley, M. R. "Design and Evaluation for Situation Awareness Enhancement." Proceedings of the Human Factors Society 32nd Annual Meeting 1988: 97-101.

Gawron, V. J. "Human factors issues in the development, evaluation, and operation of uninhabited aerial vehicles." AUVSI '98: Proceedings of the Association for Unmanned Vehicle Systems International. 1998. 431-438.

Hart, S. G. "Pilots' workload coping strategies." Challenges in Aviation Human Factors: The National Plan. Washington, DC: American Institute of Aeronautics and Astronautics, 1990. 25-28.

Jones, D. G., and M. R. Endsley. "Sources of situation awareness errors in aviation." Aviation, Space, and Environmental Medicine 67.6 (1996): 507-512.

Klein, G. A. "A Recognition-Primed Decision (RPD) Model of Rapid Decision Making." Klein, G. A., et al. Decision Making in Action: Models and Methods. Norwood, NJ: Ablex, 1993. 138-147.

Klein, G. A., et al. "Ten Challenges for Making Automation a 'Team Player' in Joint Human-Agent Activity." IEEE Intelligent Systems 19.6 (2004): 91-95.

Moray, N. "Monitoring behavior and supervisory control." Boff, K. R., L. Kaufman and J. P. Thomas. Handbook of Perception and Human Performance. New York: Wiley & Sons, 1986. 40-51.

Singh, I. L., B. Hilburn and R. Parasuraman. "Effect of feedback on adaptive automation." Journal of the Indian Academy of Applied Psychology 25.1-2 (1999): 157-65.

The UAV - Unmanned Aerial Vehicle. n.d. Online. 26 April 2018.

Wickens, C. D. Engineering Psychology and Human Performance. New York: Harper Collins, 1992.

Wiener, E. L. "Application of vigilance research: rare, medium, or well done?" Human Factors 1987: 725-736.

September 11, 2023