EXPERT SYSTEMS AND AUTOMATION TOOLS AS HUMAN COMPLEMENTS IN HIGH RISK INDUSTRIES

Ivan Sikora
Emirates Airline, P.O. Box 686, Dubai, United Arab Emirates, fax +971 4 3167577, ivan.sikora@emirates.com

Abstract

Aviation is at the spearhead of technological development. Humans have long been considered superior to other living species on Earth, yet objective safety indicators drawn from accident and incident statistics prove otherwise. Regardless of technological advancement, there is still no risk-free transportation. People are becoming the weakest link and one of the major causes of aviation incidents and accidents. Accepting and respecting this when designing and operating transportation systems should enable all involved to run their systems at the lowest possible risk. The paper presents an overview of the influence that human psychology has on performance and errors in aviation and other high risk industries, together with the way automation and expert systems try to complement humans. It has been developed using deduction and analysis to present the basic influence of human psychology, while induction and synthesis have been used to extrapolate the findings to transportation industries beyond aviation itself.

Keywords: safety, technology, human factors, risk

1. INTRODUCTION

Aviation has long been at the spearhead of technological development. Civil aviation, and nations worldwide, have benefited from technological advances usually introduced in pursuit of wartime supremacy. Advances in technology, ergonomics and psychology have typically originated in the military before being applied in civil aviation. Nowadays the boundaries between civil aviation and other industries have been reduced substantially.

Psychology as an applied science [1.] is meant to explain, measure and sometimes even influence the behaviour of humans and some other living species. It is usually combined with other scientific disciplines (e.g. sociology, anthropology) in order to study organizational behaviour. The purpose of this paper is to discuss errors in aviation and other high risk industries, together with the way that automation and expert systems try to complement humans, giving an overview of the influence that human psychology has on the understanding of human performance.

Human error is not limited to aviation. There are other fields, more subtle and less visible to the public, where human error takes a substantial number of lives. It is understood, for example, that operating theatres are even more complex environments than aircraft cockpits. Baggage handlers make fewer mistakes with our luggage than medical staff do with drugs given to patients in hospital. The merchant marine industry faces the loss of roughly one ship every day (370 ships a year). The quality of people and organizations is no guarantee of problem-free functioning or mission accomplishment. As Reason [2.] states, the best people can sometimes make the worst mistakes, and the greatest problems can happen even to a properly run organization. The cause is the fact that skill is tied very closely to error, both being essential attributes of human nature. For that reason an important part of human factors studies is focused on the nature of error and the steps necessary to control it.
Very early in the history of aviation people realized that once the performance of the aircraft or any other mechanical system had been pushed to its limit, the only remaining element that could make a difference was the human operator. Not knowing enough about that element, aviation users (military and civil) called on psychologists for help between the two World Wars, with an early focus on aircrew selection and training, the effects of sleep loss and fatigue, and various aspects of visual perception and display design. Today, more than 60 years later, most designers of technological systems still do not pay enough attention to human needs and capabilities. The result of that approach is safety systems that, as automation increases, become ever more difficult and complex to operate. Users are not against technology, but they need technology that pays sufficient attention to human needs and capabilities.

The major terminology used in this paper is defined as follows. Error can be defined as the failure of planned actions to achieve their desired end without the intervention of some unforeseeable event [2.]. It results from the physiological and psychological limitations of humans. In addition to individual errors there are team errors as well. There are three major error types: skill based (i.e. slips, lapses, trips or fumbles), rule based and knowledge based (i.e. mistakes). Safety is the absence of danger or risk and is usually measured by the number of negative outcomes. Safety and risk, according to Wiener [6.], are determined by the environment (social, organizational, etc.) and by experience. Once a risk has been analysed, it is either assessed for acceptance or managed by different means. Managing risk is a process that seeks to increase the likelihood of achieving programme or project goals. High risk industries include the nuclear power industry, the military, the space agencies, the chemical process industry and aviation. Mistakes by their front line operators (e.g. control room operators and pilots) can lead to the release of destructive mass, energy, chemicals and the like. As Reason [2.] concludes, the safety of the system hinges critically on the reliability of a small number of human controllers. Their safety is influenced by the characteristics of the workplace and the informational properties of the human-machine interface. Human factors (HF), or ergonomics, may be defined as the technology concerned with optimizing the relationships between people and their activities by the systematic application of the human sciences (behavioural and medical), integrated within the framework of system engineering [1.]. With regard to safety issues, HF embraces a far wider range of individuals and activities than those associated with the front-line operation of a system.

2. EXPERT SYSTEMS AND AUTOMATION TOOLS: FACTS, THEORY AND APPLICATION

2.1. Statistical Facts

Aviation has become one of the most reliable and safe modes of transportation overall. The growth of airline passenger traffic is substantial (e.g. more than 859 billion passenger miles in 1985 alone [1.]), while roughly two accidents occur in every million departures. The share of technical systems in the total number of incidents is very low: Steiner [7.] quotes that only 10% of aircraft accidents are caused by aircraft and aircraft system failures. Mechanical cause factors have been reduced significantly in jet accidents, while human error has increased.
Experts point to the low number of accidents caused by mechanical malfunctions as a testament to the increased reliability of technical systems.

Fig. 1. A diagram illustrating the dominant role played by human performance in civil aircraft accidents (International Air Transport Association, 1975)

The high visibility and often equally large number of fatalities of plane crashes have forced the aviation industry to take error very seriously. In 1995 the risk to passengers (the probability of being involved in an accident with at least one fatality) varied by a factor of 42 across the world's air carriers, from a 1 in 260,000 chance of death or injury with the worst carriers to a 1 in 11,000,000 probability with the best [1.].

Humans have long been considered superior to other living species in our world. Objective and realistic indicators from accident and incident statistics have proven differently. We are becoming the weakest link in the chain. As technical systems have become more reliable, humans have become one of the major causes of incidents and accidents. As Helmreich [6.] states, "Causes of error include fatigue, workload, and fear as well as cognitive overload, poor interpersonal communications, imperfect information processing, and flawed decision making." In addition to objective human limitations, there is a risk of failing to exploit the full potential of people and technology. Steiner [7.] suggests that preventive measures are needed to overcome such a situation. Overlooking HF in systems designed exclusively by engineers often leads to alienation from technology and frustration with it; in safety-critical applications this can lead to global disasters. These concerns are a common theme across other high risk industries. Human operators are exposed to a similar type of environment and similar challenges regardless of whether they sit in the cockpit of an aircraft, the control room of a nuclear power plant or on the bridge of a chemical tanker. Studies of error, stress and teamwork among operating theatre and intensive care unit medical staff and airline pilots from around the world have shown similarities in attitudes and work values.

2.2. Systems Theory and Knowledge

Aviation and other high risk industries share two very important characteristics: the responsibility resting on their front line operators and the criticality of their safety. Safety is influenced by the characteristics of the workplace and the human-machine interface. Understanding human psychology, behaviour and failures should enable users to operate their systems at the lowest possible risk. When designing more efficient human-machine systems, the most fruitful approach to system design would be, according to Wiener [5.], an initial acceptance of individuals' characteristics, followed by the choice or design of other resources to match these properties, which must be accepted as given.

The traditional advantage, and burden, of aviation compared to other high risk industries is its "visibility" and appeal. Hence aviation started early to collect flight data regularly, initially in order to understand what had gone wrong in accidents and, more recently, to improve what is not quite right. Expert systems and automation can complement humans. Knowing human limitations, through HF studies, can help in deciding what type of complement to use and where; the types of errors committed indicate the type of remedy needed. The two most common ways of augmenting human capabilities are enhancing basic human senses and assisting the human operator in making decisions.
The most basic way in which mechanical systems complemented human weaknesses was by enhancing basic human senses. Another type of advanced automation has been enabled by the vast increase in the computational power and speed of contemporary computers. The underlying idea is again that understanding human psychology, behaviour and failures should enable users to operate their systems at the lowest possible risk. In this case automation, as Wiener states [1.], monitors humans and assists the pilot in his role by detecting errors as they are made.

2.3. Systems Examples and Potential for Application

It is likely a mistake to assume that technology is the ultimate solution to human error in high risk industries and aviation, just as automation was once assumed to be the ultimate cockpit solution. Wiener [1.] points out that automation did not eliminate cockpit errors but changed their type and timing. According to Maurino [8.], automated control systems were introduced to take over from the fallible and variable human operator, while human operators were left in the system in order to restore it to a safe state when situations go beyond the circumstances the automation was designed to cover. Although effective enhancement of the pilot's decision-making process is a slow process, it carries great potential for reducing errors. One problem with automation is computational capability that sometimes executes too fast for the individual to follow. Other problems are the intricate organization of the automation's active modes and their recognition by the operator. Last but not least, when automation is working properly it denies operators the chance to practise the basic professional skills they need for recovery if the automation fails. The major HF concern of pilots regarding the introduction of automation in the cockpit was: "Who is in control?" [1.]

Humans are involved in all phases of a system's life. Design, construction, manufacturing, operation and maintenance all involve humans, and therefore there are numerous opportunities for those humans to introduce error unintentionally and without being aware of it. Most designers of technological systems do not pay enough attention to human needs and capabilities. Latent errors are usually introduced by people far "...in time and space", as Reason states [5.], from the direct control interface: designers, high-level decision makers, construction workers, managers and maintenance personnel. Hence two areas of specific HF design issues must be addressed: (a) proper design of the user interface (hardware or software); (b) an integrated process for designing software around user capabilities and limitations, influencing the functions served and the modes of application within an organization.

The initial definition of the standard aircraft instrument panel layout (in the 1930s) is a valid example of proper hardware design. Proper software design examples include the use of high-resolution colour graphic displays, because people are very good at recognizing graphical patterns (e.g. advanced colour graphics such as the horizontal and vertical situation indicators, rather than monochromatic text scopes or coloured text, for applications such as aircraft engine monitors; or the computer display for monitoring the safety of water-based nuclear power plants developed by Beltracchi (1987) [3.]). This way of presenting information leads to better interactions between people and technology than the traditional one.
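Returning to the idea, quoted from Wiener above, of automation that monitors the operator and detects errors as they are made, the following is a minimal, purely illustrative sketch in Python of such a cross-check. The scenario, field names, altitudes and rules are invented for illustration only and do not describe any real avionics system or certified logic.

# Minimal, hypothetical sketch of automation that monitors operator entries and
# flags likely errors as they are made. The scenario, field names and limits
# below are invented for illustration; this is not any real avionics logic.

from dataclasses import dataclass

@dataclass
class ClearanceContext:
    cleared_altitude_ft: int       # altitude assigned by ATC (assumed input)
    minimum_safe_altitude_ft: int  # terrain-related floor for the current segment

def check_selected_altitude(selected_ft, ctx):
    """Return warning messages for a newly selected target altitude."""
    warnings = []
    # Rule 1: the selected value should match the clearance the crew read back.
    if selected_ft != ctx.cleared_altitude_ft:
        warnings.append("Selected %d ft differs from cleared %d ft."
                        % (selected_ft, ctx.cleared_altitude_ft))
    # Rule 2: never accept a target below the minimum safe altitude.
    if selected_ft < ctx.minimum_safe_altitude_ft:
        warnings.append("Selected %d ft is below the minimum safe altitude of %d ft."
                        % (selected_ft, ctx.minimum_safe_altitude_ft))
    return warnings

if __name__ == "__main__":
    ctx = ClearanceContext(cleared_altitude_ft=7000, minimum_safe_altitude_ft=4300)
    # A slip: the crew dials 4000 ft instead of the cleared 7000 ft.
    for msg in check_selected_altitude(4000, ctx):
        print("WARNING:", msg)

The point of the sketch is not the specific rules but the pattern: the automation independently holds the expected state (the clearance, the safe envelope) and warns the moment the operator's entry departs from it.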
The functions served can be creating awareness; detecting and warning of the presence of off-normal conditions; and recovering from off-normal conditions to restore the system to a safe state. Modes of application within an organization can be, for example, engineered safety devices or implementation in training.

The aircraft Flight Management System (FMS) is an example of an engineered safety device. The FMS was introduced to support the human crew by taking over some tasks, or portions of tasks, from them (e.g. flight path guidance, power plant or environmental control) in favour of discussion and communication among crew members. An example of creating awareness is the filing system for the interpretation of medical x-ray images, where each image is interpreted by an emergency room (ER) doctor and then reviewed by a radiologist within 12 hours. The new system reduced the potential for error six-fold, as shown in Fig. 2. Another medical example is a rule-based system for prescribing and recording drugs given to patients through wireless terminals. Complete and legible prescriptions eliminated transcription errors, preventing 58 unsafe prescriptions and giving over 700 high-level warnings during the initial 11 months of application [9.].

Fig. 2. Change in the number of errors per 1000 images checked, old system versus new system. [9.]

Last but not least, training support within organizations has been enhanced by recent advances in simulation technology and replay techniques. Developed and perfected in civil aviation, they have been transferred to the maritime industry [4.] and to fire fighting [5.].
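Returning to the rule-based prescribing system mentioned above, the following minimal, hypothetical sketch in Python illustrates how such if-then checks operate. The drug names, dose limits and rules are invented for illustration and are not those of the system reported in [9.].

# Minimal, hypothetical sketch of a rule-based prescribing check in the spirit of
# the system described in [9.]. The drug table, dose limits and rules below are
# invented for illustration and are not the rules of the actual system.

# Hypothetical maximum single doses in milligrams.
MAX_SINGLE_DOSE_MG = {"paracetamol": 1000, "ibuprofen": 800}

def check_prescription(rx, patient_allergies):
    """Apply simple if-then rules to one prescription and return warnings."""
    warnings = []
    # Rule 1: a prescription must be complete (nothing omitted or illegible).
    for field in ("drug", "dose_mg", "route", "frequency"):
        if not rx.get(field):
            warnings.append("Incomplete prescription: missing '%s'." % field)
    drug = str(rx.get("drug", "")).lower()
    # Rule 2: the dose must not exceed the drug's maximum single dose.
    limit = MAX_SINGLE_DOSE_MG.get(drug)
    if limit is not None and rx.get("dose_mg", 0) > limit:
        warnings.append("Dose %s mg exceeds the %d mg limit for %s."
                        % (rx["dose_mg"], limit, drug))
    # Rule 3: never prescribe a drug the patient is recorded as allergic to.
    if drug in patient_allergies:
        warnings.append("Patient is recorded as allergic to %s." % drug)
    return warnings

if __name__ == "__main__":
    rx = {"drug": "Ibuprofen", "dose_mg": 1200, "route": "oral", "frequency": ""}
    for msg in check_prescription(rx, {"penicillin"}):
        print("WARNING:", msg)

Real prescribing systems encode far richer clinical knowledge, but the principle is the same: every order is tested against explicit if-then rules, and the clinician is warned before an unsafe or incomplete prescription reaches the patient.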
3. CONCLUSION

Regardless of advancement on the technological front, there is still no risk-free transportation. That applies to aviation and to any other mode of transportation. Human error has played a progressively more important role in accidents. Automatic control systems and their problems are not unique to aviation at all; still, they are the most extensively studied within aviation because flight data recording is available. Taking advantage of data recording and investigation, design, manufacturing, operation and maintenance mistakes have been corrected by prescribing strict minimum technical standards, which has resulted in the development and application of sophisticated aircraft technology.

Charles Lindbergh during his 1927 New York to Paris flight and today's crews (in the latest-technology airliner or high-speed train) face the same problem: in the majority of cases their errors have not been considered in a systems context. Wiener makes very clear [1.] that although some solutions are intended to remove a single error or a group of errors, the system in general is not affected at all. Systems into which latent errors were introduced long before the actual breach of system defences will fail regardless of solutions intended to remove the active errors of front-line operators. As Steiner [7.] stresses, preventive actions can be focused either on improvements in design and engineering or on the introduction of automation, which has been seen as a universal remedy, with control tasks shared between human and machine. The most beneficial approach would be to change the engineering approach and to promote the importance of designing and applying technologies that work for people performing the full range of their professional workplace activities. Design practices must also be modified to focus on producing technological systems that fulfil human needs, as opposed to creating overly complex, technically sophisticated systems that are difficult for the average person to use. The remedy would be for engineers to begin with a human need (rather than a technological possibility) and to focus on the interactions between people and technology (rather than on the technology alone). Technological systems can be designed to match human nature at all scales: physical, psychological, team, and organizational.

4. LITERATURE

[1.] E. L. Wiener et al., Human Factors in Aviation, Academic Press, San Diego, 1988.
[2.] J. Reason, Managing the Risks of Organizational Accidents, Ashgate Publishing Company, Burlington, 1998.
[3.] K. J. Vicente, The Human Factor, The Bridge, Volume 32, Number 4, Winter 2002.
[4.] A. Ali, Simulator Instructor - STCW Requirements and Reality, Pomorstvo, god. 20, br. 2 (2006), str. 23-32.
[5.] J. Reason, Human Error, Cambridge University Press, Cambridge, 1990.
[6.] R. L. Helmreich, Culture at Work in Aviation and Medicine: National, Organizational and Professional Influences, Ashgate Publishing Company, Burlington, 1998.
[7.] S. Steiner, Elementi sigurnosti zracnog prometa, Fakultet prometnih znanosti Sveucilista u Zagrebu, Zagreb, 1999.
[8.] D. E. Maurino et al., Beyond Aviation Human Factors, Ashgate Publishing Company, Burlington, 1995.
[9.] P. Nightingale, Changes in Processes Can Substantially Reduce Error, British Medical Journal, Volume 320, No 7237, March 2000.

Notes:
1. In 1939 aircrew selection criteria and training were guided by the Medical Research Council in the UK [1.].
2. A non-mandatory simulation system in the maritime industry covering navigation, watch keeping, ship handling and manoeuvring, and cargo handling and stowage.
3. The Dynamic Environmental Simulation System (DESSY) simulates the complex task of fire fighting as seen by the fire-fighting chief.