<?xml version="1.0" encoding="utf-8"?>
<sevenpack>
<comment>
This file was created by the Typo3 extension
sevenpack version 0.7.14

--- Timezone: CEST
Creation date: 2018-10-12
Creation time: 09-03-01
--- Number of references
101

</comment>
<reference>
<bibtype>article</bibtype>
<citeid>RozantsevLF2015</citeid>
<title>On Rendering Synthetic Images for Training an Object Detector</title>
<journal>Computer Vision and Image Understanding</journal>
<year>2015</year>
<month>1</month>
<day>20</day>
<abstract>We propose a novel approach to synthesizing images that are effective for training object detectors. Starting from a small set of real images, our algorithm estimates the rendering parameters required to synthesize similar images given a coarse 3D model of the target object. These parameters can then be reused to generate an unlimited number of training images of the object of interest in arbitrary 3D poses, which can then be used to improve classification performance.

A key insight of our approach is that the synthetically generated images should be similar to real images, not in terms of image quality, but rather in terms of features used during the detector training. We show in the context of drone, plane, and car detection that using such synthetically generated images yields significantly better performance than simply perturbing real images or even synthesizing images in such a way that they look very realistic, as is often done when only limited amounts of training data are available.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.sciencedirect.com/science/article/pii/S1077314214002446</web_url>
<state></state>
<DOI>10.1016/j.cviu.2014.12.006</DOI>
<authors>
<person><fn>A.</fn><sn>Rozantsev</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>PerfectJW2015</citeid>
<title>Handling Qualities Requirements for Future Personal Aerial Vehicles</title>
<journal>Journal of Guidance, Control, and Dynamics</journal>
<year>2015</year>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<state>accepted</state>
<DOI>10.2514/1.G001073</DOI>
<authors>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>PerfectJW2015_2</citeid>
<title>Methods to Assess the Handling Qualities Requirements for Personal Aerial Vehicles</title>
<journal>Journal of Guidance, Control, and Dynamics</journal>
<year>2015</year>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<state>accepted</state>
<DOI>10.2514/1.G000862</DOI>
<authors>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>Schuchardt2015</citeid>
<title>Towards Handling Qualities Evaluations of a Personal Aerial Vehicle in Ground-Based and In-Flight Simulation</title>
<year>2015</year>
<month>5</month>
<day>5</day>
<pages>1-12</pages>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_name>American Helicopter Society Forum 71, Virginia Beach, Virginia</event_name>
<authors>
<person><fn>B.I.</fn><sn>Schuchardt</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>OlivariNBP2015_2</citeid>
<title>Identifying Time-Varying Neuromuscular Response: Experimental Evaluation of a RLS-based Algorithm</title>
<year>2015</year>
<pages>1-15</pages>
<abstract>Methods for identifying neuromuscular response commonly assume time-invariant neuromuscular dynamics. However, neuromuscular dynamics are likely to change during realistic control scenarios. In a previous paper we presented a method for identifying time-varying neuromuscular dynamics based on a Recursive Least Squares (RLS) algorithm. To date, this method has only been validated in a Monte Carlo simulation study. This paper presents an experimental validation of the same method. In the experiment, three different disturbance-rejection tasks were performed: a position task with the human instructed to minimize the stick deflection in the presence of an external force disturbance, a relax task with the instruction to relax the arm, and a time-varying task with the instruction to alternate between position and relax tasks. The position and relax tasks induce different time-invariant neuromuscular dynamics, whereas the time-varying task induces time-varying neuromuscular dynamics. The RLS-based method was used to estimate neuromuscular dynamics in the three tasks. The neuromuscular estimates were reliable both in time-invariant and time-varying tasks. These findings indicate that the RLS-based method can be used to estimate time-varying neuromuscular responses in human-in-the-loop experiments.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_name>Scitech 2015</event_name>
<DOI>10.2514/6.2015-0658</DOI>
<authors>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>OlivariNBP2014_2</citeid>
<title>Pilot Adaptation to Different Classes of Haptic Aids in Tracking Tasks</title>
<journal>Journal of Guidance, Control, and Dynamics</journal>
<year>2014</year>
<month>11</month>
<day>1</day>
<volume>37</volume>
<number>6</number>
<pages>1741-1753</pages>
<abstract>Haptic aids have been widely used in manual control tasks to complement the visual information through the sense of touch. To analytically design a haptic aid, adequate knowledge is needed about how pilots adapt their visual response and the biomechanical properties of their arm (i.e., admittance) to a generic haptic aid. In this work, two different haptic aids, a direct haptic aid and an indirect haptic aid, are designed for a target tracking task, with the aim of investigating the pilot response to these aids. The direct haptic aid provides forces on the control device that suggest the right control action to the pilot, whereas the indirect haptic aid provides forces opposite in sign with respect to the direct haptic aid. The direct haptic aid and the indirect haptic aid were tested in an experimental setup with nonpilot participants and compared to a condition without haptic support. It was found that control performance improved with haptic aids. Participants significantly adapted both their admittance and visual response to fully exploit the haptic aids. They were more compliant with the direct haptic aid force, whereas they showed stiffer neuromuscular settings with the indirect haptic aid, as this approach required opposing the haptic forces.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://arc.aiaa.org/doi/abs/10.2514/1.G000534</web_url>
<DOI>10.2514/1.G000534</DOI>
<authors>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>TrzcinskiCL2014</citeid>
<title>Learning Image Descriptors with Boosting</title>
<journal>IEEE Transactions on Pattern Analysis and Machine Intelligence</journal>
<year>2014</year>
<month>7</month>
<day>29</day>
<volume>PP</volume>
<number>99</number>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6867341</web_url>
<DOI>10.1109/TPAMI.2014.2343961</DOI>
<authors>
<person><fn>T.</fn><sn>Trzcinski</sn></person>
<person><fn>M.</fn><sn>Christoudias</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>MWSSMR2014</citeid>
<title>Motion and Uncertainty-Aware Path Planning for Micro Aerial Vehicles</title>
<journal>Journal of Field Robotics, Special Issue on Low-Altitude Flight of UAVs</journal>
<year>2014</year>
<month>6</month>
<day>5</day>
<volume>31</volume>
<number>4</number>
<pages>676-698</pages>
<abstract>Localization and state estimation are reaching a certain maturity in mobile robotics, often providing both a precise robot pose estimate at a point in time and the corresponding uncertainty. In a bid to increase the robots' autonomy, the community now turns to more advanced tasks, such as navigation and path planning. For a realistic path to be computed, neither the uncertainty of the robot's perception nor the vehicle's dynamics can be ignored. In this work, we propose to specifically exploit the information on uncertainty, while also accounting for the physical laws governing the motion of the vehicle. Making use of rapidly exploring random belief trees, here we evaluate offline multiple path hypotheses in a known map to select a path exhibiting the motion required to estimate the robot's state accurately and, inherently, to avoid motion in modes where otherwise observable states are not excited. We demonstrate the proposed approach on a micro aerial vehicle performing visual-inertial navigation. Such a system is known to require sufficient excitation to reach full observability. As a result, the proposed methodology plans safe avoidance not only of obstacles, but also of areas where localization might fail during real flights, compensating for the limitations of the localization methodology available. We show that our planner actively improves the precision of the state estimation by selecting paths that minimize the uncertainty in the estimated states. Furthermore, our experiments illustrate by comparison that a naive planner would fail to reach the goal within bounded uncertainty in most cases.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201406_JFR_Achtelik.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://onlinelibrary.wiley.com/doi/10.1002/rob.21522/abstract</web_url>
<DOI>10.1002/rob.21522</DOI>
<authors>
<person><fn>M.W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Lynen</sn></person>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>SunCLF2014</citeid>
<title>Real-time landing place assessment in man-made environments</title>
<journal>Machine Vision and Applications</journal>
<year>2014</year>
<month>1</month>
<day>1</day>
<volume>25</volume>
<number>1</number>
<pages>211-227</pages>
<abstract>We propose a novel approach to the real-time landing site detection and assessment in unconstrained man-made environments using passive sensors. Because this task must be performed in a few seconds or less, existing methods are often limited to simple local intensity and edge variation cues. By contrast, we show how to efficiently take into account the potential sites’ global shape, which is a critical cue in man-made scenes. Our method relies on a new segmentation algorithm and shape regularity measure to look for polygonal regions in video sequences. In this way, we enforce both temporal consistency and geometric regularity, resulting in very reliable and consistent detections. We demonstrate our approach for the detection of landable sites such as rural fields, building rooftops and runways from color and infrared monocular sequences significantly outperforming the state-of-the-art.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://link.springer.com/article/10.1007%2Fs00138-013-0560-7</web_url>
<DOI>10.1007/s00138-013-0560-7</DOI>
<authors>
<person><fn>X.</fn><sn>Sun</sn></person>
<person><fn>C. M.</fn><sn>Christoudias</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>JumpPWL2015</citeid>
<title>Handling Qualities and Training Requirements for Personal Aerial Vehicles</title>
<year>2014</year>
<month>10</month>
<day>28</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Aachen, Germany</event_place>
<event_name>4th EASN Association International Workshop on Flight Physics &amp; Aircraft Design, Aachen, Germany</event_name>
<authors>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>L.</fn><sn>Lu</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>OlivariNBP2015</citeid>
<title>Identifying time-varying neuromuscular system with a recursive least-squares algorithm: a Monte-Carlo simulation study</title>
<year>2014</year>
<month>10</month>
<pages>3573-3578</pages>
<abstract>A human-centered design of haptic aids aims at tuning the force feedback based on the effect it has on human behavior. For this goal, a better understanding of the influence of haptic aids on the pilot neuromuscular response becomes crucial. In realistic scenarios, the neuromuscular response can continuously vary depending on many factors, such as environmental factors or pilot fatigue. This paper presents a method that estimates time-varying neuromuscular dynamics online during force-related tasks. This method is based on a Recursive Least Squares (RLS) algorithm and assumes that the neuromuscular response can be approximated by a Finite Impulse Response filter. The reliability and the robustness of the method were investigated by performing a set of Monte-Carlo simulations with increasing levels of remnant noise. Even with high levels of remnant noise, the RLS algorithm provided accurate estimates when the neuromuscular dynamics were constant or changed slowly. With instantaneous changes, the RLS algorithm needed almost 8 s to converge to a reliable estimate. These results seem to indicate that the RLS algorithm is a valid tool for online estimation of time-varying admittance.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&amp;arnumber=6974484</web_url>
<event_name>IEEE International Conference on Systems, Man and Cybernetics (SMC 2014)</event_name>
<DOI>10.1109/SMC.2014.6974484</DOI>
<authors>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>ScheerBC2014</citeid>
<title>Is the novelty-P3 suitable for indexing mental workload in steering tasks?</title>
<journal>12th Biannual Meeting of the German Cognitive Science (KogWis 2014)</journal>
<year>2014</year>
<month>9</month>
<day>30</day>
<pages>135-136</pages>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/20140930_KogWis_Scheer.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.uni-tuebingen.de/en/faculties/faculty-of-science/departments/interdepartmental-centres/cognitive-science-in-tuebingen/kogwis-2014.html</web_url>
<publisher>Springer</publisher>
<address>Berlin, Germany</address>
<event_place>Tübingen, Germany</event_place>
<event_name>12th Biannual Meeting of the German Cognitive Science (KogWis 2014)</event_name>
<authors>
<person><fn>M.</fn><sn>Scheer</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>SunCF2014</citeid>
<title>Free-Shape Polygonal Object Localization</title>
<year>2014</year>
<month>9</month>
<day>6</day>
<pages>317-332</pages>
<abstract>Polygonal objects are prevalent in man-made scenes. Early approaches to detecting them relied mainly on geometry while subsequent ones also incorporated appearance-based cues. It has recently been shown that this could be done fast by searching for cycles in graphs of line-fragments, provided that the cycle scoring function can be expressed as additive terms attached to individual fragments. In this paper, we propose an approach that eliminates this restriction. Given a weighted line-fragment graph, we use its cyclomatic number to partition the graph into manageably sized sub-graphs that preserve nodes and edges with a high weight and are most likely to contain object contours. Object contours are then detected as maximally scoring elementary circuits enumerated in each sub-graph. Our approach can be used with any cycle scoring function, and multiple candidates that share line fragments can be found. This is unlike other approaches, which rely on a greedy strategy to find candidates. We demonstrate that our approach significantly outperforms the state-of-the-art for the detection of building rooftops in aerial images and polygonal object categories from ImageNet.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/20140906_ECCV_Sun.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://link.springer.com/chapter/10.1007/978-3-319-10599-4_21</web_url>
<editor>David Fleet, Tomas Pajdla, Bernt Schiele, Tinne Tuytelaars</editor>
<booktitle>Computer Vision – ECCV 2014, Proceedings Part 6, Lecture Notes in Computer Science</booktitle>
<event_place>Zürich, Switzerland</event_place>
<event_name>Computer Vision – ECCV 2014</event_name>
<DOI>10.1007/978-3-319-10599-4_21</DOI>
<authors>
<person><fn>X.</fn><sn>Sun</sn></person>
<person><fn>C.M.</fn><sn>Christoudias</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>GerboniGONBP2014</citeid>
<title>Development of a 6 DOF Nonlinear Helicopter Model for the MPI CyberMotion Simulator</title>
<year>2014</year>
<month>9</month>
<day>3</day>
<pages>1-12</pages>
<abstract>This paper describes the different phases of realizing and validating a helicopter model for the MPI CyberMotion Simulator (CMS). The considered helicopter is a UH-60 Black Hawk. The helicopter model was developed based on equations and parameters available in the literature. First, the validity of the model was assessed by performing tests based on ADS-33E-PRF criteria using closed-loop controllers and with a non-expert pilot. Results on simulated data were similar to results obtained with the real helicopter. Second, the validity of the model was assessed with a helicopter pilot in the loop in both a fixed-base simulator and the CMS. The pilot performed a vertical remask maneuver defined in ADS-33E-PRF. Most performance metrics were adequately met with both simulators. The motion cues in the CMS allowed for improvements in some of the metrics. The pilot was also asked to give a subjective evaluation of the model by answering the Israel Aircraft Industries Pilot Rating Scale (IAI PRS). Consistent with the ADS-33E-PRF results, pilot responses confirmed that the motion cues provided a more realistic flight experience.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201409_ERF_Gerboni.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://aerosociety.com/About-Us/specgroups/Rotorcraft/ERF-2014</web_url>
<publisher>Royal Aeronautical Society</publisher>
<event_place>Southampton, United Kingdom</event_place>
<event_name>40th European Rotorcraft Forum, Southampton, United Kingdom</event_name>
<authors>
<person><fn>C.A.</fn><sn>Gerboni</sn></person>
<person><fn>S.</fn><sn>Geluardi</sn></person>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>LuJPWA2015</citeid>
<title>Development of occupant-preferred landing profiles for personal aerial vehicle applications</title>
<year>2014</year>
<month>9</month>
<day>2</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Grand Harbour Hotel, Southampton, UK</event_place>
<event_name>40th European Rotorcraft Forum, Southampton, United Kingdom</event_name>
<authors>
<person><fn>L.</fn><sn>Lu</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>M.</fn><sn>Aldridge</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>PerfectWJ2015</citeid>
<title>Development of pilot training requirements for personal aerial vehicles</title>
<year>2014</year>
<month>9</month>
<day>2</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Southampton, UK</event_place>
<event_name>40th European Rotorcraft Forum, Southampton, United Kingdom</event_name>
<authors>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>FladNBC2014</citeid>
<title>System Delay in Flight Simulators Impairs Performance and Increases Physiological Workload</title>
<year>2014</year>
<month>6</month>
<day>22</day>
<pages>3-11</pages>
<abstract>Delays between user input and the system’s reaction in control tasks have been shown to have a detrimental effect on performance. This is often accompanied by increases in self-reported workload. In the current work, we sought to identify physiological measures that correlate with pilot workload in a conceptual aerial vehicle that suffered from varying time delays between control input and vehicle response. For this purpose, we measured the skin conductance and heart rate variability of 8 participants during flight maneuvers in a fixed-base simulator. Participants were instructed to land a vehicle while compensating for roll disturbances under different conditions of system delay. We found that control error and the self-reported workload increased with increasing time delay. Skin conductance and input behavior also reflected corresponding changes. Our results show that physiological measures are sufficiently robust for evaluating the adverse influence of system delays in a conceptual vehicle model.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201406_HCI_Flad.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.springerprofessional.de/001---system-delay-in-flight-simulators-impairs-performance-and-increases-physiological-workload/5155788.html;jsessionid=3C14314C37ADE8F064676C642166560C.sprprofltc0101</web_url>
<publisher>Springer International Publishing</publisher>
<event_place>Heraklion, Crete, Greece</event_place>
<event_name>11th HCI International 2014, Engineering Psychology and Cognitive Ergonomics</event_name>
<ISBN>978-3-319-07515-0</ISBN>
<authors>
<person><fn>N.</fn><sn>Flad</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>ScheerNBC2014</citeid>
<title>The Influence of Visualization on Control Performance in a Flight Simulator</title>
<year>2014</year>
<month>6</month>
<day>22</day>
<pages>202-211</pages>
<abstract>Flight simulators are often assessed in terms of how well they imitate the physical reality that they endeavor to recreate. Given that vehicle simulators are primarily used for training purposes, it is equally important to consider the implications of visualization in terms of its influence on the user’s control performance. In this paper, we report that a complex and realistic visual world environment can result in larger performance errors compared to a simplified, yet equivalent, visualization of the same control task. This is accompanied by an increase in subjective workload. A detailed analysis of control performance indicates that this is because the error perception is more variable in a real world environment.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201406_HCI_Scheer.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.springerprofessional.de/021---the-influence-of-visualization-on-control-performance-in-a-flight-simulator/5155812.html;jsessionid=3C14314C37ADE8F064676C642166560C.sprprofltc0101</web_url>
<publisher>Springer International Publishing</publisher>
<event_place>Heraklion, Crete, Greece</event_place>
<event_name>11th HCI International 2014, Engineering Psychology and Cognitive Ergonomics</event_name>
<ISBN>978-3-319-07515-0</ISBN>
<authors>
<person><fn>M.</fn><sn>Scheer</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>LuJPW2014</citeid>
<title>Development of a Visual Landing Profile with Natural-Feeling Cues</title>
<journal>Proceedings of the American Helicopter Society Forum 70</journal>
<year>2014</year>
<month>5</month>
<day>21</day>
<pages>1-12</pages>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>https://vtol.org/store/product/development-of-a-visual-landing-profile-with-naturalfeeling-cues-9519.cfm</web_url>
<event_place>Montreal, Quebec, Canada</event_place>
<event_name>American Helicopter Society Forum 70, Montreal Quebec, Canada</event_name>
<authors>
<person><fn>L.</fn><sn>Lu</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>GeluardiNPB2014</citeid>
<title>Frequency Domain System Identification of a Light Helicopter in Hover</title>
<year>2014</year>
<month>5</month>
<day>21</day>
<pages>1721-1731</pages>
<abstract>This paper presents the implementation of a Multi-Input Single-Output fully coupled transfer function model of a civil light helicopter in hover. A frequency domain identification method is implemented. It is discussed how the chosen frequency range of excitation allows some important rotor dynamic modes to be captured. Therefore, studies that require coupled rotor/body models are possible. The pitch-rate response with respect to the longitudinal cyclic is considered in detail throughout the paper. Different transfer functions are evaluated to compare their capability to capture the main helicopter dynamic modes. It is concluded that models with order less than 6 are not able to model the lead-lag dynamics in the pitch axis. Nevertheless, a transfer function model of the 4th order can provide acceptable results for handling qualities evaluations. The identified transfer function models are validated in the time domain with input signals different from those used during the identification, and show good predictive capabilities. From the results it is possible to conclude that the identified transfer function models are able to capture the main dynamic characteristics of the considered light helicopter in hover.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201405_AHS_Geluardi.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.vtol.org/annual-forum/forum-70/forum-70</web_url>
<event_place>Montreal, Quebec, Canada</event_place>
<event_name>AHS International's 70th Annual Forum and Technology Display, Montreal, Quebec, Canada</event_name>
<authors>
<person><fn>S.</fn><sn>Geluardi</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>JonesPJW2014</citeid>
<title>Investigation of Novel Concepts for Control of a Personal Air Vehicle</title>
<year>2014</year>
<month>5</month>
<day>21</day>
<pages>1-16</pages>
<abstract>This paper reports results from a study to investigate the use of novel control systems designed to provide a safe and reliable method for the control of a future Personal Aerial Vehicle (PAV). Previous research into vehicle response type requirements, conducted within the EC FP-7 project myCopter, has centered around ‘conventional’ rotorcraft-type control configurations. In this paper, the use of response and control characteristics derived from road vehicles is investigated. Objective and subjective techniques are used to quantify and qualify the applicability of automobile-like response characteristics using traditional helicopter control inceptors modified to behave somewhat like automobile controls, using foot pedals to control an Acceleration Command, Speed Hold system, for example. Additionally, the effects of eliminating vehicle pitch and roll dynamics are investigated to determine whether this allows a reduction in workload for non-professional pilots. Results suggest that, particularly for the most inexperienced of pilots, the automobile-like configuration is more suitable for control of a PAV than an augmented set of helicopter-style response types. This is shown through increased performance, and a reduction in subjective NASA Task Load Index (TLX) ratings. Improved Handling Qualities Ratings (HQRs) were also obtained in the automobile-like system for the tasks undertaken. Overall, removal of pitch and roll dynamics was not found to significantly affect task performance in the automobile-like system, but their absence resulted in a decrease in performance for the rotorcraft-style response type configurations tested.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>https://vtol.org/store/product/investigation-of-novel-concepts-for-control-of-a-personal-air-vehicle-9517.cfm</web_url>
<publisher>AHS</publisher>
<event_place>Montréal, Québec, Canada</event_place>
<event_name>AHS 70th Annual Forum and Technology Display 2014, Montreal, Quebec, Canada</event_name>
<authors>
<person><fn>M.</fn><sn>Jones</sn></person>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>M.</fn><sn>White</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>PerfectJW2014</citeid>
<title>Investigation of Personal Aerial Vehicle Handling Qualities Requirements for Harsh Environmental Conditions</title>
<year>2014</year>
<month>5</month>
<day>21</day>
<pages>1-14</pages>
<abstract>This paper describes the continuing research at the University of Liverpool in the myCopter project to develop handling qualities guidelines and criteria for a new category of aircraft – the personal aerial vehicle (PAV), which, it is envisaged, should demand no more skill to fly than that associated with driving a car today. Previously published research showed that a translational rate command (TRC) response type allowed a majority of ‘flight-naïve’ pilots to operate within desired performance limits in a series of hover and low speed tasks in good environmental conditions. This paper extends the research by exploring the impact of degrading the usable cue environment and introducing atmospheric disturbances on performance in these tasks. Results from simulation trials involving test subjects with little or no flight experience are reported, showing that, in general, task performance can be maintained with the TRC response type, although workload increases. The paper concludes that the TRC response type remains suitable for use by ‘flight-naïve’ pilots in PAVs, even in degraded environmental conditions.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201405_AHS_UoL.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>https://vtol.org/annual-forum/forum-70</web_url>
<publisher>American Helicopter Society International, Inc.</publisher>
<event_place>Montreal, Quebec, Canada</event_place>
<event_name>AHS International's 70th Annual Forum and Technology Display, Montreal, Quebec, Canada</event_name>
<authors>
<person><fn>Philip</fn><sn>Perfect</sn></person>
<person><fn>Michael</fn><sn>Jump</sn></person>
<person><fn>Mark D</fn><sn>White</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>OlivariNBP2014</citeid>
<title>An Experimental Comparison of Haptic and Automated Pilot Support Systems</title>
<year>2014</year>
<month>1</month>
<day>13</day>
<pages>1-11</pages>
<abstract>External aids are required to increase safety and performance during the manual control of an aircraft. Automated systems can surpass the performance usually achieved by pilots. However, they suffer from several issues caused by pilot unawareness of the control command from the automation. Haptic aids can overcome these issues by showing their control command through forces on the control device. To investigate how the transparency of the haptic control action influences performance and pilot behavior, a quantitative comparison between haptic aids and automation is needed. An experiment was conducted in which pilots performed a compensatory tracking task with haptic aids and with automation. The haptic aid and the automation were designed to be equivalent when the pilot was out-of-the-loop, i.e., to provide the same control command. Pilot performance and control effort were then evaluated with pilots in-the-loop and contrasted to a baseline condition without external aids. The haptic system allowed pilots to improve performance compared with the baseline condition. However, automation outperformed the other two conditions. Pilots' control effort was reduced by the haptic aid and the automation in a similar way. In addition, the pilot open-loop response was estimated with a non-parametric estimation method. Changes in the pilot response were observed in terms of increased crossover frequency with automation, and decreased neuromuscular peak with haptics.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201401_SCITECH_Olivari.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://arc.aiaa.org/doi/abs/10.2514/6.2014-0809</web_url>
<publisher>AIAA</publisher>
<event_place>National Harbour, Maryland</event_place>
<event_name>AIAA Modeling and Simulation Technologies Conference</event_name>
<DOI>10.2514/6.2014-0809</DOI>
<authors>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>NieuwenhuizenB2014</citeid>
<title>Evaluation of Haptic Shared Control and a Highway-in-the-Sky Display for Personal Aerial Vehicles</title>
<year>2014</year>
<month>1</month>
<day>13</day>
<pages>1-9</pages>
<abstract>Highway-in-the-sky displays and haptic shared control could provide an easy-to-use control interface for non-expert pilots. In this paper, various display and haptic approaches are evaluated in a flight control task with a personal aerial vehicle. It is shown that a tunnel or a wall representation of the flight trajectory leads to the best performance and the lowest control activity and effort. Similar results are obtained when haptic guidance cues are based on the error of a predicted position of the vehicle with respect to the flight trajectory. Such haptic cues are also subjectively preferred by the pilots. This study indicates that the combination of a haptic shared control framework and a highway-in-the-sky display can provide non-expert pilots with an easy-to-use control interface for flying a personal aerial vehicle.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201401_SCITECH_FMN.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://arc.aiaa.org/doi/abs/10.2514/6.2014-0808</web_url>
<publisher>AIAA</publisher>
<event_place>National Harbor, Maryland</event_place>
<event_name>AIAA Modeling and Simulation Technologies Conference</event_name>
<DOI>10.2514/6.2014-0808</DOI>
<authors>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inbook</bibtype>
<citeid>Decker2014</citeid>
<title>Who is taking over? Technology assessment of autonomous (service) robots</title>
<year>2014</year>
<pages>91-110</pages>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.peterlang.com/index.cfm?event=cmp.ccc.seitenstruktur.detailseiten&amp;seitentyp=produkt&amp;pk=62839</web_url>
<editor>Funk, Michael / Irrgang, Bernhard</editor>
<publisher>Peter Lang</publisher>
<series>Dresden Philosophy of Technology Studies. Vol. 5</series>
<booktitle>Robotics in Germany and Japan. Philosophical and technical perspectives</booktitle>
<ISBN>978-3-631-62071-7</ISBN>
<authors>
<person><fn>M</fn><sn>Decker</sn></person>
</authors>
</reference>
<reference>
<bibtype>techreport</bibtype>
<citeid>RozantsevLF2014</citeid>
<title>On Rendering Synthetic Images for Training an Object Detector</title>
<year>2014</year>
<month>6</month>
<day>16</day>
<abstract>We propose a novel approach to synthesizing images that are effective for training object detectors. Starting from a small set of real images, our algorithm estimates the rendering parameters required to synthesize similar images given a coarse 3D model of the target object. These parameters can then be reused to generate an unlimited number of training images of the object of interest in arbitrary 3D poses, which can then be used to increase classification performances.
A key insight of our approach is that the synthetically generated images should be similar to real images, not in terms of image quality, but rather in terms of features used during the classifier training. We demonstrate the benefits of using such synthetically generated images in the context of drone detection, where a limited amount of training data is available.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://infoscience.epfl.ch/record/199706?ln=en</web_url>
<institution>Ecole Polytechnique Federale de Lausanne</institution>
<authors>
<person><fn>A.</fn><sn>Rozantsev</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>GlatzBC2014</citeid>
<title>Looming auditory warnings initiate earlier event-related potentials in a manual steering task</title>
<journal>12th Biannual Conference of the German Cognitive Science Society (KogWis 2014)</journal>
<year>2014</year>
<month>9</month>
<day>30</day>
<pages>S38</pages>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/20140930_KogWis_Glatz.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.uni-tuebingen.de/en/faculties/faculty-of-science/departments/interdepartmental-centres/cognitive-science-in-tuebingen/kogwis-2014.html</web_url>
<authors>
<person><fn>C.</fn><sn>Glatz</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>SymeonidouOBC2014</citeid>
<title>The role of direct haptic feedback in a compensatory tracking task</title>
<journal>12th Biannual Meeting of the German Cognitive Science Society (KogWis 2014)</journal>
<year>2014</year>
<month>9</month>
<day>30</day>
<pages>S71</pages>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/20140930_KogWis_Symeonidou.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.uni-tuebingen.de/en/faculties/faculty-of-science/departments/interdepartmental-centres/cognitive-science-in-tuebingen/kogwis-2014.html</web_url>
<authors>
<person><fn>E.-R.</fn><sn>Symeonidou</sn></person>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>FladC2014</citeid>
<title>Setting up a high-fidelity flight simulator to study closed-loop control and physiological workload</title>
<journal>Interdisciplinary College (IK) 2014, Cognition 3.0 - the social mind in the connected world</journal>
<year>2014</year>
<month>3</month>
<day>16</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<authors>
<person><fn>N.</fn><sn>Flad</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Decker2014_2</citeid>
<title>Flying to work? Results from the EU-Project myCopter</title>
<year>2014</year>
<month>12</month>
<day>10</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Lisbon, Portugal</event_place>
<event_name>Winter School on Technology Assessment 2014</event_name>
<authors>
<person><fn>M.</fn><sn>Decker</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Achtelik2014_3</citeid>
<title>myCopter – Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2014</year>
<month>12</month>
<day>10</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Villigen, Switzerland</event_place>
<event_name>Paul Scherrer Institute</event_name>
<authors>
<person><fn>M.</fn><sn>Achtelik</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Nieuwenhuizen2014_2</citeid>
<title>myCopter – Enabling Technologies for Personal Aerial Transportation Systems. An overview of accomplishments</title>
<year>2014</year>
<month>11</month>
<day>6</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.airtec.aero/index.php?id=55</web_url>
<institution>Airtec</institution>
<event_place>Frankfurt a. Main, Germany</event_place>
<event_name>5. Internationale HELI World Konferenz</event_name>
<authors>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Nieuwenhuizen2014</citeid>
<title>myCopter – Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2014</year>
<month>10</month>
<day>28</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.easn.net/workshops/1/21/</web_url>
<event_place>RWTH Aachen, Germany</event_place>
<event_name>4th EASN Association International Workshop on Flight Physics and Aircraft Design</event_name>
<authors>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Achtelik2014</citeid>
<title>Vision-Based Control and Navigation for Personal Aerial Vehicles in GPS-Restricted Environments</title>
<year>2014</year>
<month>10</month>
<day>28</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>RWTH Aachen, Germany</event_place>
<event_name>4th EASN Association International Workshop on Flight Physics and Aircraft Design</event_name>
<authors>
<person><fn>M.</fn><sn>Achtelik</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2014_2</citeid>
<title>Personal aviation: From Cierva gyroplanes to myCopter and beyond</title>
<year>2014</year>
<month>10</month>
<day>1</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<web_url>http://aerosociety.com/Events/Event-List/1644/Cierva-Named-Lecture-2014</web_url>
<institution>Royal Aeronautical Society</institution>
<event_place>London</event_place>
<event_name>Cierva Named Lecture 2014, Royal Aeronautical Society</event_name>
<authors>
<person><fn>Heinrich H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Achtelik2014_2</citeid>
<title>The ASL SLAM and state estimation ROS stack</title>
<year>2014</year>
<month>9</month>
<day>14</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Chicago, USA</event_place>
<event_name>IROS 2014</event_name>
<authors>
<person><fn>M.</fn><sn>Achtelik</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2014_3</citeid>
<title>My visions on air mobility</title>
<year>2014</year>
<month>8</month>
<day>30</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Keynote Lecture</talk_type>
<web_url>http://www.air14.ch/internet/air14/en/home/themen.html</web_url>
<event_place>Payerne, Switzerland</event_place>
<event_name>Air14, International Aviation and Space Symposium</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2014</citeid>
<title>Projekt myCopter: Die Autos der Zukunft werden fliegen</title>
<year>2014</year>
<month>6</month>
<day>18</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<web_url>http://www.2bahead.com/nc/tv/rede/video/projekt-mycopter-die-autos-der-zukunft-werden-fliegen/</web_url>
<institution>2b AHEAD ThinkTank GmbH</institution>
<event_place>Wolfsburg</event_place>
<event_name>2b AHEAD ThinkTank Zukunftskongress 2014</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>BiegBBC2013</citeid>
<title>Saccade reaction time asymmetries during task-switching in pursuit tracking</title>
<journal>Experimental Brain Research</journal>
<year>2013</year>
<month>10</month>
<day>1</day>
<volume>230</volume>
<number>3</number>
<pages>271-281</pages>
<abstract>We investigate how smooth pursuit eye movements affect the latencies of task-switching saccades. Participants had to alternate their foveal vision between a continuous pursuit task in the display center and a discrete object discrimination task in the periphery. The pursuit task was either carried out by following the target with the eyes only (ocular) or by steering an on-screen cursor with a joystick (oculomanual). We measured participants’ saccadic reaction times (SRTs) when foveal vision was shifted from the pursuit task to the discrimination task and back to the pursuit task. Our results show asymmetries in SRTs depending on the movement direction of the pursuit target: SRTs were generally shorter in the direction of pursuit. Specifically, SRTs from the pursuit target were shorter when the discrimination object appeared in the motion direction. SRTs to pursuit were shorter when the pursuit target moved away from the current fixation location. This result was independent of the type of smooth pursuit behavior that was performed by participants (ocular/oculomanual). The effects are discussed in regard to asymmetries in attention and processes that suppress saccades at the onset of pursuit.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://link.springer.com/article/10.1007%2Fs00221-013-3651-9</web_url>
<DOI>10.1007/s00221-013-3651-9</DOI>
<PUBMED>23934441</PUBMED>
<authors>
<person><fn>H.-J.</fn><sn>Bieg</sn></person>
<person><fn>J.-P.</fn><sn>Bresciani</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
<person><fn>L. L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>VenrooijAMvMvB2013</citeid>
<title>A Biodynamic Feedthrough Model Based on Neuromuscular Principles</title>
<journal>IEEE Transactions on Cybernetics</journal>
<year>2013</year>
<month>9</month>
<day>9</day>
<volume>PP</volume>
<number>99</number>
<abstract>A biodynamic feedthrough (BDFT) model is proposed that describes how vehicle accelerations feed through the human body, causing involuntary limb motions and so involuntary control inputs. BDFT dynamics strongly depend on limb dynamics, which can vary between persons (between-subject variability), but also within one person over time, e.g., due to the control task performed (within-subject variability). The proposed BDFT model is based on physical neuromuscular principles and is derived from an established admittance model---describing limb dynamics---which was extended to include control device dynamics and account for acceleration effects. The resulting BDFT model serves primarily the purpose of increasing the understanding of the relationship between neuromuscular admittance and biodynamic feedthrough. An added advantage of the proposed model is that its parameters can be estimated using a two-stage approach, making the parameter estimation more robust, as the procedure is largely based on the well documented procedure required for the admittance model. To estimate the parameter values of the BDFT model, data are used from an experiment in which both neuromuscular admittance and biodynamic feedthrough are measured. The quality of the BDFT model is evaluated in the frequency and time domain. Results provide strong evidence that the BDFT model and the proposed method of parameter estimation put forward in this paper allows for accurate BDFT modeling across different subjects (accounting for between-subject variability) and across control tasks (accounting for within-subject variability).</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<DOI>10.1109/TCYB.2013.2280028</DOI>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>D.A.</fn><sn>Abbink</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>M.M.</fn><sn>van Paassen</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>F.C.T.</fn><sn>van der Helm</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>VenrooijMAvMvB2013</citeid>
<title>Mathematical Biodynamic Feedthrough Model Applied to Rotorcraft</title>
<journal>IEEE Transactions on Cybernetics</journal>
<year>2013</year>
<month>9</month>
<day>5</day>
<volume>PP</volume>
<number>99</number>
<abstract>Biodynamic feedthrough (BDFT) occurs when vehicle accelerations feed through the human body and cause involuntary control inputs. This paper proposes a model to quantitatively predict this effect in rotorcraft. This mathematical BDFT model aims to fill the gap between the currently existing black box BDFT models and physical BDFT models. The model structure was systematically constructed using asymptote modeling, a procedure described in detail in this paper. The resulting model can easily be implemented in many typical rotorcraft BDFT studies, using the provided model parameters. The model's performance was validated in both the frequency and time domain. Furthermore, it was compared with several recent BDFT models. The results show that the proposed mathematical model performs better than typical black box models and is easier to parameterize and implement than a recent physical model.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<DOI>10.1109/TCYB.2013.2279018</DOI>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>D.A.</fn><sn>Abbink</sn></person>
<person><fn>M.M.</fn><sn>van Paassen</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>F.C.T.</fn><sn>van der Helm</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>VenroojPMvB2013</citeid>
<title>A practical biodynamic feedthrough model for helicopters</title>
<journal>CEAS Aeronautical Journal</journal>
<year>2013</year>
<month>7</month>
<day>1</day>
<volume>4</volume>
<number>4</number>
<pages>421-432</pages>
<abstract>Biodynamic feedthrough (BDFT) occurs when vehicle accelerations feed through the pilot’s body and cause involuntary motions of limbs, resulting in involuntary control inputs. BDFT can severely reduce ride comfort, control accuracy and, above all, safety during the operation of rotorcraft. Furthermore, BDFT can cause and sustain rotorcraft-pilot couplings. Despite many different studies conducted in past decades—both within and outside of the rotorcraft community—BDFT is still a poorly understood phenomenon. The complexities involved in BDFT have kept researchers and manufacturers in the rotorcraft domain from developing robust ways of dealing with its effects. A practical BDFT pilot model, describing the amount of involuntary control inputs as a function of accelerations, could pave the way to account for adverse BDFT effects. In the current paper, such a model is proposed. Its structure is based on the model proposed by Mayo (15th European Rotorcraft Forum, Amsterdam, pp. 81-001–81-012 1989), and its accuracy and usability are improved by incorporating insights from recently obtained experimental data. An evaluation of the model performance shows that the model describes the measured data well and that it provides a considerable improvement to the original Mayo model. Furthermore, the results indicate that the neuromuscular dynamics have an important influence on the BDFT model parameters.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<DOI>10.1007/s13272-013-0083-y</DOI>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>M.D.</fn><sn>Pavel</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>F.C.T.</fn><sn>van der Helm</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>WeissALAKCS2013</citeid>
<title>Monocular Vision for Long-term Micro Aerial Vehicle State Estimation: A Compendium</title>
<journal>Journal of Field Robotics</journal>
<year>2013</year>
<month>5</month>
<day>13</day>
<volume>30</volume>
<number>5</number>
<pages>803-831</pages>
<abstract>The recent technological advances in Micro Aerial Vehicles (MAVs) have triggered great interest in the robotics community, as their deployability in missions of surveillance and reconnaissance has now become a realistic prospect. The state of the art, however, still lacks solutions that can work for a long duration in large, unknown, and GPS-denied environments. Here, we present our visual pipeline and MAV state-estimation framework, which uses feeds from a monocular camera and an Inertial Measurement Unit (IMU) to achieve real-time and onboard autonomous flight in general and realistic scenarios. The challenge lies in dealing with the power and weight restrictions onboard a MAV while providing the robustness necessary in real and long-term missions. This article provides a concise summary of our work on achieving the first onboard vision-based power-on-and-go system for autonomous MAV flights. We discuss our insights on the lessons learned throughout the different stages of this research, from the conception of the idea to the thorough theoretical analysis of the proposed framework and, finally, the real-world implementation and deployment. Looking into the onboard estimation of monocular visual odometry, the sensor fusion strategy, the state estimation and self-calibration of the system, and finally some implementation issues, the reader is guided through the different modules comprising our framework. The validity and power of this framework are illustrated via a comprehensive set of experiments in a large outdoor mission, demonstrating successful operation over flights of more than 360 m trajectory and 70 m altitude change.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<DOI>10.1002/rob.21466</DOI>
<authors>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>M.W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Lynen</sn></person>
<person><fn>M.C.</fn><sn>Achtelik</sn></person>
<person><fn>L.</fn><sn>Kneip</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>article</bibtype>
<citeid>VenrooijMAvvBM2013</citeid>
<title>A New View on Biodynamic Feedthrough Analysis: Unifying the Effects on Forces and Positions</title>
<journal>IEEE Transactions on Cybernetics</journal>
<year>2013</year>
<month>2</month>
<volume>43</volume>
<number>1</number>
<pages>129-142</pages>
<abstract>When performing a manual control task, vehicle accelerations can cause involuntary limb motions, which can result in unintentional control inputs. This phenomenon is called biodynamic feedthrough (BDFT). In the past decades, many studies into BDFT have been performed, but its fundamentals are still only poorly understood. What has become clear, though, is that BDFT is a highly complex process, and its occurrence is influenced by many different factors. A particularly challenging topic in BDFT research is the role of the human operator, which is not only a very complex but also a highly adaptive system. In literature, two different ways of measuring and analyzing BDFT are reported. One considers the transfer of accelerations to involuntary forces applied to the control device (CD); the other considers the transfer of accelerations to involuntary CD deflections or positions. The goal of this paper is to describe an approach to unify these two methods. It will be shown how the results of the two methods relate and how this knowledge may aid in understanding BDFT better as a whole. The approach presented is based on the notion that BDFT dynamics can be described by the combination of two transfer dynamics: 1) the transfer dynamics from body accelerations to involuntary forces and 2) the transfer dynamics from forces to CD deflections. The approach was validated using experimental results.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<DOI>10.1109/TSMCB.2012.2200972</DOI>
<PUBMED>22752141</PUBMED>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>Mark</fn><sn>Mulder</sn></person>
<person><fn>D. A.</fn><sn>Abbink</sn></person>
<person><fn>M. M.</fn><sn>van Paassen</sn></person>
<person><fn>F. C. T.</fn><sn>van der Helm</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
<person><fn>Max</fn><sn>Mulder</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>NieuwenhuizenCB2013</citeid>
<title>myCopter: Enabling Technologies for Personal Aerial Transportation Systems: Project status after 2.5 years</title>
<year>2013</year>
<month>11</month>
<day>7</day>
<pages>1-3</pages>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201311HeliWorld_FMN.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.airtec.aero/index.php?id=164</web_url>
<event_place>Frankfurt, Germany</event_place>
<event_name>5th International HELI World Conference "HELICOPTER Technologies", "HELICOPTER Operations" at the International Aerospace Supply Fair AIRTEC 2013</event_name>
<authors>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>LynenAWCS2013</citeid>
<title>A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation</title>
<year>2013</year>
<month>11</month>
<day>3</day>
<abstract>It has been long known that fusing information from multiple sensors for robot navigation results in increased robustness and accuracy. However, accurate calibration of the sensor ensemble prior to deployment in the field, as well as coping with sensor outages, different measurement rates and delays, render multi-sensor fusion a challenge. As a result, most often, systems do not exploit all the sensor information available in exchange for simplicity. For example, on a mission requiring transition of the robot from indoors to outdoors, it is the norm to ignore the Global Positioning System (GPS) signals which become freely available once outdoors and instead, rely only on sensor feeds (e.g., vision and laser) continuously available throughout the mission. Naturally, this comes at the expense of robustness and accuracy in real deployment. This paper presents a generic framework, dubbed Multi-Sensor-Fusion Extended Kalman Filter (MSF-EKF), able to process delayed, relative and absolute measurements from a theoretically unlimited number of different sensors and sensor types, while allowing self-calibration of the sensor-suite online. The modularity of MSF-EKF allows seamless handling of additional/lost sensor signals during operation while employing a state buffering scheme augmented with Iterated EKF (IEKF) updates to allow for efficient re-linearization of the prediction to get near optimal linearization points for both absolute and relative state updates. We demonstrate our approach in outdoor navigation experiments using a Micro Aerial Vehicle (MAV) equipped with a GPS receiver as well as visual, inertial, and pressure sensors.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201311_IROS_ETHZlynen.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.iros2013.org/</web_url>
<publisher>IEEE</publisher>
<booktitle>Proc. of the IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), 2013</booktitle>
<event_place>Tokyo, Japan</event_place>
<event_name>IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), 2013</event_name>
<state>unpublished</state>
<authors>
<person><fn>S.</fn><sn>Lynen</sn></person>
<person><fn>M.W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>AchtelikLCS2013</citeid>
<title>Inversion Based Direct Position Control and Trajectory Following of Micro Aerial Vehicles</title>
<year>2013</year>
<month>11</month>
<day>3</day>
<abstract>In this work, we present a powerful, albeit simple, position control approach for Micro Aerial Vehicles (MAVs), targeting specifically multicopter systems. Exploiting the differential flatness of four of the six outputs of multicopters, namely position and yaw, we show that the remaining outputs of pitch and roll need not be controlled states, but rather just need to be known. Instead of the common approach of having multiple cascaded control loops (position - velocity - acceleration/attitude - angular rates), the proposed method employs an outer control loop based on dynamic inversion, which directly commands angular rates and thrust. The inner control loop then reduces to a simple proportional controller on the angular rates. As a result, not only does this combination allow for higher bandwidth compared to common control approaches, but it also eliminates many mathematical operations (only one trigonometric function is called), speeding up the necessary processing, especially on embedded systems. This approach assumes a reliable state estimation framework, which we are able to provide through previous work. As a result, with this work, we provide the missing elements necessary for a complete approach to autonomous navigation of MAVs.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201311_IROS_ETHZachtelik.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.iros2013.org/</web_url>
<publisher>IEEE</publisher>
<booktitle>Proc. of the IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), 2013</booktitle>
<event_place>Tokyo, Japan</event_place>
<event_name>IEEE/RSJ Conference on Intelligent Robots and Systems (IROS), 2013</event_name>
<state>unpublished</state>
<authors>
<person><fn>M.W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Lynen</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>SchipplDMF2013</citeid>
<title>Personal air vehicles as a new option for commuting in Europe: vision or illusion?</title>
<year>2013</year>
<month>10</month>
<day>2</day>
<pages>1-20</pages>
<abstract>A broad range of technology trajectories can be observed enabling new mobility options and services in the future transport system. Usually, Information and Communication Technologies (ICT) play a key role in this context. Well in line with the objectives of sustainable transport, the focus is usually on making transport modes cleaner, reducing mobility needs or enabling a modal shift to more efficient modes of transport. But the future is open and hardly predictable, even if there surely is a potential for governing even complex socio-technical systems such as a transport system in a desired direction.

Nevertheless, it is always possible, and should not be ignored, that “surprises” emerge on the scene which have not really been anticipated by the majority of experts in the field. Prominent examples can be found in the ICT sector, with the extremely fast diffusion of personal computers and cell phones. To give an example from the transport sector: for a long time it was not really envisioned that the market penetration of e-mobility would make its first success story in the bicycle sector. Foresight or “monitoring” activities are useful approaches for enabling an early detection of such developments and for helping to avoid surprises.

Against this background, this presentation will take a closer look at the potential of personal air vehicles (PAV) to gain market shares in the transport sector. Visions of PAVs can be traced back to the early 20th century. Nowadays, a considerable number of demonstrators (also known as flying cars or roadable aircraft) are being developed; for some of them, commercialization is announced to come soon. These approaches are often neglected in transport-related visions and scenarios, or simply dismissed as unrealistic or outright fantastic. The presentation will discuss whether more attention needs to be paid to developments in the PAV sector in the development of transportation scenarios and policies for the coming decades.

The paper to be presented is based on work carried out in the context of the FP7 project “MyCopter” (www.mycopter.eu). The central idea of the project is to avoid the typical problems associated with ground-based transportation by using the third dimension, combining the best of ground-based and air-based transportation. The solution pursued in MyCopter is the creation of a personal air transport system (PATS) that can overcome the environmental and financial costs associated with current methods of transport. To enable this PATS, Personal Aerial Vehicles (PAVs) are envisioned for traveling between homes and workplaces. They should fly at low altitude in urban environments. Such PAVs should be fully or partially autonomous, without requiring ground-based air traffic control, and should operate outside controlled airspace. They should be designed in a way that allows for the use of battery-based electric propulsion systems.

The presentation will illustrate what scenarios for a future integration of PAVs into the transport system could look like. According to the project specifications, the focus will be on commuting. Using examples from German cities (which are among the most congested cities in Europe and where a financially strong group of potential “early adopters” can be expected), the paper assesses the presuppositions for and implications of a market penetration of PAVs. One scenario to be discussed will be offering air vehicles as a sort of taxi-like service, flying fully autonomously and carrying two people at maximum. But other scenarios will be outlined as well. It will be shown that to a certain extent environmental and technical challenges might be solved; but, quite similar to motorized individual transport, a key challenge relates to the limited infrastructure capacities, especially for landing and storing vehicles in central business districts. Furthermore, public acceptance will be a crucial issue. Based on this work, it will be possible to provide a clearer picture of the advantages and disadvantages of PAVs, in particular as regards potential impacts on sustainability.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201310_ETC_KIT.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://abstracts.aetransport.org/paper/index/id/233/confid/1</web_url>
<publisher>Association for European Transport</publisher>
<event_place>Frankfurt a. Main, Germany</event_place>
<event_name>European Transport Conference 2013</event_name>
<authors>
<person><fn>J.</fn><sn>Schippl</sn></person>
<person><fn>M.</fn><sn>Decker</sn></person>
<person><fn>S.</fn><sn>Meyer-Soylu</sn></person>
<person><fn>T.</fn><sn>Fleischer</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>GurskyM2013</citeid>
<title>Novel Steering Concepts For Personal Aerial Vehicles</title>
<year>2013</year>
<month>9</month>
<day>11</day>
<abstract>Against the background of constantly growing ground-based traffic and consequently increasing congestion problems, solutions have to be found for meeting the future demand for personal transportation. The European project myCopter is addressing this issue by investigating technologies for future Personal Aerial Vehicles (PAV). These rotorcraft are meant to be available to the general public with a minimal necessary amount of training. This paper looks for answers to the question of the most suitable control concept for future PAVs. Car-like steering concepts would be a candidate for flight-naïve PAV users. Several concepts have already been designed for rotorcraft but have not been investigated further. DLR is now facing this challenge. In the paper, an overview of the historical development of control devices in automobiles and helicopters is given. From this development, and from research results of related projects, a novel control concept for PAVs is proposed. The intention is to offer a control concept that is intuitively understood by PAV users who are already used to steering automobiles. The concept as well as the underlying PAV flight dynamics are explained, and a short outlook is given on the planned future research at DLR.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201309_DLRK2013.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.dlrk2013.dglr.de/</web_url>
<event_place>Stuttgart, Germany</event_place>
<event_name>Deutscher Luft- und Raumfahrtkongress 2013</event_name>
<authors>
<person><fn>B. I.</fn><sn>Gursky</sn></person>
<person><fn>D.</fn><sn>Müller</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>GeluardiNPB2013</citeid>
<title>Data collection for developing a dynamic model of a light helicopter</title>
<year>2013</year>
<month>9</month>
<day>4</day>
<abstract>At the Max Planck Institute for Biological Cybernetics, the influence of an augmented system on helicopter pilots with limited flight skills is being investigated. This study would provide important contributions to the research field of personal air transport systems. In this project, the flight condition under study is the hover. The first step is the implementation of a rigid-body dynamic model. This could be used to perform handling qualities evaluations for comparing the pilot performances with and without the augmented system. This paper aims to provide a lean procedure and a reliable measurement setup for the collection of flight test data. The latter are necessary to identify the helicopter dynamic model. The mathematical and technical tools used to reach this purpose are described in detail. First, the measurement setup used to collect the piloted control inputs and the helicopter response is presented. Second, a description of the flight maneuvers and the pilot training phase is given. Finally, the flight test data collection is described and the results are shown to assess and validate the setup and the procedure presented.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201309_ERF_MPI.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://erf2013.org/</web_url>
<event_place>Moscow, Russia</event_place>
<event_name>39th European Rotorcraft Forum (ERF)</event_name>
<authors>
<person><fn>S.</fn><sn>Geluardi</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>PerfectWJ2013_2</citeid>
<title>Pilot sensitivity to handling qualities-related design parameters for a personal aerial vehicle concept</title>
<year>2013</year>
<month>9</month>
<day>4</day>
<abstract>This paper describes research underway at the University of Liverpool in the myCopter project to develop handling qualities guidelines and criteria for a new category of aircraft – the personal aerial vehicle, which, it is envisaged, should demand no more skill than that associated with driving a car today. The focus of this paper is on assessing the sensitivity of ‘flight naïve’ pilots to changes in the characteristics of a translational rate command (TRC) response type and the force-feel of a traditional centre stick inceptor. The experiments identified an acceptable band of TRC velocity rise time characteristics to be between 2.5s and 5.0s. Only small variations in performance and workload were identified for changes in the force-feel characteristics, although increased performance was noted with higher spring gradients.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201309_ERF_UoL.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://erf2013.org/</web_url>
<event_place>Moscow, Russia</event_place>
<event_name>39th European Rotorcraft Forum (ERF)</event_name>
<authors>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>ChuangNB2013_2</citeid>
<title>A Fixed-Based Flight Simulator Study: The Interdependence of Flight Control Performance and Gaze Efficiency</title>
<year>2013</year>
<month>7</month>
<day>21</day>
<pages>95-104</pages>
<abstract>Here, a descriptive study is reported that addresses the relationship between flight control performance and instrument scanning behavior. This work was performed in a fixed-based flight simulator. It targets the ability of untrained novices to pilot a lightweight rotorcraft in a flight scenario that consisted of fundamental mission task elements such as speed and altitude changes. The results indicate that better control performance occurs when gaze is more selective for and focused on key instruments. Ideal instrument scanning behavior is proposed and its relevance for training instructions and visual instrument design is discussed.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201307_EPCE_MPI.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://hcii2013.org/</web_url>
<event_place>Las Vegas, Nevada, USA</event_place>
<event_name>10th International Conference on Engineering Psychology and Cognitive Ergonomics (EPCE) Applications and Services, 2013</event_name>
<DOI>10.1007/978-3-642-39354-9_11</DOI>
<authors>
<person><fn>L. L.</fn><sn>Chuang</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>BiegBC2013</citeid>
<title>Attentional Biases during Steering Behavior</title>
<year>2013</year>
<month>7</month>
<day>21</day>
<pages>21-27</pages>
<abstract>In the current study, we examine eye movements of human operators during a combined steering and discrimination task. In this task, observers had to alternate their gaze between a central steering task and a discrimination task in the periphery. Our results show that the observer’s gaze behavior is influenced by the motion direction of the steering task. Saccade reaction times (SRTs) of saccades to the discrimination target were shorter if the target appeared in the steering direction. SRTs back to the steering task were shorter when the steering target moved away from the discrimination target. These effects are likely the result of motion-related attention shifts and an interaction of the saccadic and smooth pursuit eye movement system.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201307_DHM_Bieg.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.hcii2013.org/index.php?module=pagesmith&amp;uop=view_page&amp;id=18</web_url>
<editor>Vincent G. Duffy</editor>
<publisher>Springer Berlin Heidelberg</publisher>
<event_place>Las Vegas, Nevada, USA</event_place>
<event_name>Digital Human Modeling and Applications in Health, Safety, Ergonomics, and Risk Management: Healthcare and Safety of the Environment and Transport, 4th International Conference DHM 2013, Held as Part of HCI International 2013</event_name>
<DOI>10.1007/978-3-642-39173-6_3</DOI>
<authors>
<person><fn>H.-J.</fn><sn>Bieg</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>DropPvMB2013</citeid>
<title>Feedforward and Feedback Control Behavior in Helicopter Pilots during a Lateral Reposition Task</title>
<year>2013</year>
<month>5</month>
<day>21</day>
<abstract>Pure feedback and pure open-loop feedforward helicopter pilot models are currently applied for predicting the performance of pilot-helicopter systems. We argue that feedback models are likely to underestimate performance in many realistic helicopter maneuvers, whereas inverse simulation models, which have an open-loop feedforward structure, are likely to overestimate performance as they neglect typical human-in-the-loop characteristics. True verification of feedback and feedforward elements in helicopter pilot control behavior was never performed, however. This paper proposes a pilot model containing a feedback and feedforward controller acting simultaneously and presents a method to identify the hypothesized feedforward action from human-in-the-loop data collected in a simulator experiment. The results of the human-in-the-loop experiment show that actual human performance is better than predicted by a pure feedback model and worse than predicted by an (inverse dynamics) feedforward model. The identification results suggest that the human pilot indeed utilizes feedforward strategies, but it was not possible to either confirm or refute the model by means of the collected data and the developed analysis method.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201305_AHS_MPI.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>https://vtol.org/annual-forum</web_url>
<event_place>Phoenix, Arizona, USA</event_place>
<event_name>AHS 69th Annual Forum and Technology Display</event_name>
<authors>
<person><fn>F. M.</fn><sn>Drop</sn></person>
<person><fn>D. M.</fn><sn>Pool</sn></person>
<person><fn>M. M.</fn><sn>van Paassen</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>PerfectWJ2013</citeid>
<title>Towards Handling Qualities Requirements for Future Personal Aerial Vehicles</title>
<year>2013</year>
<month>5</month>
<day>21</day>
<pages>1-21</pages>
<abstract>This paper describes research under way at the University of Liverpool in the myCopter project to develop handling qualities guidelines and criteria for a new category of aircraft – the personal aerial vehicle, which, it is envisaged, will demand no more skill than that associated with driving a car today. Testing has been conducted both with test pilots and pilots with less experience – ranging from private pilot’s license holders through to those with no prior flight experience. The objective has been to identify, for varying levels of flying skill, the response type requirements in order to ensure safe and precise flight. The work has shown that conventional rotorcraft response types such as rate command, attitude hold and attitude command, attitude hold are unsuitable for likely PAV pilots. However, response types such as translational rate command and acceleration command, speed hold permit ‘flight naïve’ pilots to repeatedly perform demanding tasks with the required precision.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201305_AHS2013.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>https://vtol.org/annual-forum</web_url>
<event_place>Phoenix, Arizona</event_place>
<event_name>AHS 69th Annual Forum and Technology Display</event_name>
<authors>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M. D.</fn><sn>White</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>AchtelikWCS2013</citeid>
<title>Path Planning for Motion Dependent State Estimation on Micro Aerial Vehicles</title>
<year>2013</year>
<month>5</month>
<day>6</day>
<abstract>With navigation algorithms reaching a certain maturity in the field of mobile robots, the community now focuses on more advanced tasks like path planning towards increased autonomy. While the goal is to efficiently compute a path to a target destination, the uncertainty in the robot’s perception cannot be ignored if a realistic path is to be computed. With most state-of-the-art navigation systems providing the uncertainty in motion estimation, here we propose to exploit this information. This leads to a system that can plan safe avoidance of obstacles, and more importantly, it can actively aid navigation by choosing a path that minimizes the uncertainty in the monitored states. Our proposed approach is applicable to systems requiring certain excitations in order to render all their states observable, such as a MAV with visual-inertial based localization. In this work, we propose an approach which takes into account this necessary motion during path planning: by employing Rapidly exploring Random Belief Trees (RRBT), the proposed approach chooses a path to a goal which allows for best estimation of the robot’s states, while inherently avoiding motion in unobservable modes. We discuss our findings within the scenario of vision-based aerial navigation as one of the most challenging navigation problems, requiring sufficient excitation to reach full observability.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201305_ICRA_ETHZ.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.icra2013.org/</web_url>
<event_place>Karlsruhe, Germany</event_place>
<event_name>IEEE International Conference on Robotics and Automation</event_name>
<authors>
<person><fn>M. W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>techreport</bibtype>
<citeid>RozantsevCLF2013</citeid>
<title>Detection of Aircrafts on a Collision Course using Spatio-Temporal HOG</title>
<year>2013</year>
<month>3</month>
<day>5</day>
<abstract>We have developed a method for the detection of both generic flight neighbouring aircrafts and those on a collision course. Our approach employs a sliding window linear Support Vector Machine (SVM) classifier with a Histogram of Oriented Gradients (HOG) feature representation. An extension of this approach to the spatio-temporal domain is also considered and we demonstrate its advantage for the detection of aircrafts on a collision path. We evaluated our approach for the detection of both small rotorcraft and larger fixed-wing aircrafts in challenging video sequences. Our results show that aircrafts on a collision course can be detected more reliably than when assuming a generic flight path. This is very interesting in practice, since this case is of critical importance. We also show that our spatio-temporal approach improves the detection accuracy with respect to conventional single-frame approaches.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://infoscience.epfl.ch/record/184879</web_url>
<institution>École Polytechnique Fédérale de Lausanne</institution>
<authors>
<person><fn>A.</fn><sn>Rozantsev</sn></person>
<person><fn>C.M.</fn><sn>Christoudias</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>TrzcinskiCFL2013</citeid>
<title>Boosting Binary Keypoint Descriptors</title>
<journal>26th IEEE Conference on Computer Vision and Pattern Recognition, Portland, Oregon, USA</journal>
<year>2013</year>
<month>6</month>
<day>23</day>
<abstract>Binary keypoint descriptors provide an efficient alternative to their floating-point competitors as they enable faster processing while requiring less memory. In this paper, we propose a novel framework to learn an extremely compact binary descriptor we call BinBoost that is very robust to illumination and viewpoint changes. Each bit of our descriptor is computed with a boosted binary hash function, and we show how to efficiently optimize the different hash functions so that they complement each other, which is key to compactness and robustness. The hash functions rely on weak learners that are applied directly to the image patches, which frees us from any intermediate representation and lets us automatically learn the image gradient pooling configuration of the final descriptor. Our resulting descriptor significantly outperforms the state-of-the-art binary descriptors and performs similarly to the best floating-point descriptors at a fraction of the matching time and memory footprint.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201306_CVPR_EPFL.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.pamitc.org/cvpr13/</web_url>
<authors>
<person><fn>T.</fn><sn>Trzcinski</sn></person>
<person><fn>M.</fn><sn>Christoudias</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>Bieg2013</citeid>
<title>Asymmetric saccade initiation at smooth pursuit onset</title>
<journal>23rd Oculomotor Meeting 2013, Linz, Austria</journal>
<year>2013</year>
<month>1</month>
<day>26</day>
<note>23rd Oculomotor Meeting 2013, University of Applied Sciences Upper Austria</note>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.fh-ooe.at/kongresswesen/konferenzen-kongresse/2013/23rd-oculomotor-meeting-2013/</web_url>
<authors>
<person><fn>H.-J.</fn><sn>Bieg</sn></person>
<person><fn>L.L.</fn><sn>Chuang</sn></person>
<person><fn>J.-P.</fn><sn>Bresciani</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Jump2013</citeid>
<title>Personal Aerial Transport Systems: A Flight of Fancy or a Realistic Goal?</title>
<year>2013</year>
<month>11</month>
<day>11</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://aerosociety.com/Events/Event-List/918/Light-Aircraft-Design-Methods-and-Tools-2013</web_url>
<institution>RAeS</institution>
<event_place>London, UK</event_place>
<event_name>Royal Aeronautical Society Light Aircraft Design: Methods and Tools 2013</event_name>
<authors>
<person><fn>M.</fn><sn>Jump</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2013_5</citeid>
<title>The Cybernetics of Aerial Machines: From Perception and Action for Aerial Robots to a Transport System based on Personal Aerial Vehicles</title>
<year>2013</year>
<month>11</month>
<day>11</day>
<abstract>Our brain is constantly processing a vast amount of sensory and intrinsic information in order to understand and interact with the world around us. In my department at the Max Planck Institute for Biological Cybernetics in Tübingen, and also in my research group in the Biological Cybernetics Lab at Korea University, we aim to best model human perception and action and to test these models to predict human action, for example in the context of driving and flying. To this end, we use systems and control theory, computer vision, and psychophysical techniques while conducting experiments with the most advanced motion simulators. I will present two examples to illustrate our research philosophy, the first in the area of Telepresence and the second about the enabling technologies of futuristic transportation systems: (1) An ideal telepresence system should enable the user to perceive and act on the remote environment as if sensed directly. In this context, we study new ways to interface human operators and teams of autonomous remote robots in a shared bilateral control architecture. (2) A novel framework to overcome the congestion problems of current ground-based transportation is a personal air transport system (PATS). In the myCopter project (www.mycopter.eu) we study, together with other European partners, the enabling technologies for traveling between homes and workplaces, and for flying in swarms at low altitude in urban environments. All our efforts are guided by the accepted vision that in the future humans and machines will seamlessly cooperate in shared or remote spaces, thus becoming an integral part of our daily life. For instance, robots or vehicles should be able to autonomously reason about their remote environment, i.e., to possess a significant level of autonomy in order to perform local tasks and take decisions.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Keynote Lecture</talk_type>
<web_url>http://ta2013.yonsei.ac.kr/TA2013/TA2013.html</web_url>
<event_place>Seoul, South Korea</event_place>
<event_name>3rd IFAC Symposium on Telematics Applications (TA 2013)</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>BulthoffN2013</citeid>
<title>MYCOPTER: Enabling Technologies for Personal Air-Transport Systems</title>
<year>2013</year>
<month>9</month>
<day>27</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<web_url>http://www.airtn.eu/events/archive/airtn-forum.html</web_url>
<event_place>Cranfield, UK</event_place>
<event_name>AirTN Forum</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2013</citeid>
<title>Und wenn wir einfach zur Arbeit fliegen?</title>
<year>2013</year>
<month>6</month>
<day>23</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Kelheim, Germany</event_place>
<event_name>Heliday 2013</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2013_3</citeid>
<title>Neue Konzepte für Autopiloten durch wahrnehmungsbasierte Flugsimulationen</title>
<year>2013</year>
<month>6</month>
<day>22</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<event_place>Giebelstadt, Germany</event_place>
<event_name>Rollout des neuen Fama-Jetkopters K209</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2013_4</citeid>
<title>Cyberneum reloaded: Virtual Reality and Simulation Research</title>
<year>2013</year>
<month>6</month>
<day>20</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Max Planck Campus, Tübingen, Germany</event_place>
<event_name>Opening of the new Cyberneum building</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>TrzcinskiCLF2012</citeid>
<title>Learning Image Descriptors with the Boosting-Trick</title>
<year>2012</year>
<month>12</month>
<day>6</day>
<pages>1-9</pages>
<abstract>In this paper we apply boosting to learn complex non-linear local visual feature representations, drawing inspiration from its successful application to visual object detection. The main goal of local feature descriptors is to distinctively represent a salient image region while remaining invariant to viewpoint and illumination changes. This representation can be improved using machine learning; however, past approaches have been mostly limited to learning linear feature mappings in either the original input or a kernelized input feature space. While kernelized methods have proven somewhat effective for learning non-linear local feature descriptors, they rely heavily on the choice of an appropriate kernel function, whose selection is often difficult and non-intuitive. We propose to use the boosting-trick to obtain a non-linear mapping of the input to a high-dimensional feature space. The non-linear feature mapping obtained with the boosting-trick is highly intuitive. We employ gradient-based weak learners, resulting in a learned descriptor that closely resembles the well-known SIFT. As demonstrated in our experiments, the resulting descriptor can be learned directly from intensity patches, achieving state-of-the-art performance.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201212_NIPS2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://nips.cc/Conferences/2012/</web_url>
<event_place>Lake Tahoe, Nevada</event_place>
<event_name>Neural Information Processing Systems 2012</event_name>
<authors>
<person><fn>T.</fn><sn>Trzcinski</sn></person>
<person><fn>M.</fn><sn>Christoudias</sn></person>
<person><fn>V.</fn><sn>Lepetit</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>DropPDvBM2012</citeid>
<title>Identification of the transition from compensatory to feedforward behavior in manual control</title>
<year>2012</year>
<month>10</month>
<day>15</day>
<pages>2008-2013</pages>
<abstract>The human in manual control of a dynamical system can use both feedback and feedforward control strategies and will select a strategy based on performance and required effort. Literature has shown that feedforward control is used during tracking tasks in response to predictable targets. The influence of an external disturbance signal on the utilization of a feedforward control strategy has never been investigated, however. We hypothesized that the human will use a combined feedforward and feedback control strategy whenever the predictable target signal is sufficiently strong, and a predominantly feedback strategy whenever the random disturbance signal is dominant. From the data of a human-in-the-loop experiment we conclude that feedforward control is used in all the considered experimental conditions, including those where the disturbance signal is dominant and feedforward control does not deliver a marked performance advantage.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201210_SMC_Drop.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6378033</web_url>
<publisher>IEEE</publisher>
<event_place>Seoul, Korea</event_place>
<event_name>IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2012</event_name>
<ISBN>978-1-4673-1713-9</ISBN>
<DOI>10.1109/ICSMC.2012.6378033</DOI>
<authors>
<person><fn>F.M.</fn><sn>Drop</sn></person>
<person><fn>D.M.</fn><sn>Pool</sn></person>
<person><fn>H.J.</fn><sn>Damveld</sn></person>
<person><fn>M.M.</fn><sn>van Paassen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>VenrooijMvAvMB2012</citeid>
<title>How effective is an armrest in mitigating biodynamic feedthrough?</title>
<year>2012</year>
<month>10</month>
<pages>2150-2155</pages>
<abstract>Biodynamic feedthrough (BDFT) refers to a phenomenon where vehicle accelerations feed through the human body, causing involuntary limb motions, which in turn may result in involuntary control inputs. Many studies have been devoted to mitigating BDFT effects. In the current paper, the effectiveness of a simple, cheap and widely used hardware component is studied: the armrest. An experiment was conducted in which the BDFT dynamics were measured with and without an armrest for different levels of neuromuscular admittance (i.e., different settings of the limb dynamics). The results show that the effect of the armrest on BDFT dynamics varies, both with frequency and with neuromuscular admittance.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201210_SMC2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.smc2012.org/</web_url>
<event_place>Seoul, Korea</event_place>
<event_name>IEEE International Conference on Systems, Man, and Cybernetics (SMC 2012)</event_name>
<DOI>10.1109/ICSMC.2012.6378058</DOI>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>M. M.</fn><sn>van Paassen</sn></person>
<person><fn>D. A.</fn><sn>Abbink</sn></person>
<person><fn>F. C. T.</fn><sn>van der Helm</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>AchtelikLWKCS2012</citeid>
<title>Visual-Inertial SLAM for a Small Helicopter in Large Outdoor Environments</title>
<year>2012</year>
<month>10</month>
<pages>2651-2652</pages>
<abstract>In this video, we present our latest results towards fully autonomous flights with a small helicopter. Using a monocular camera as the only exteroceptive sensor, we fuse inertial measurements to achieve a self-calibrating power-on-and-go system, able to perform autonomous flights in previously unknown, large, outdoor spaces. Our framework achieves Simultaneous Localization And Mapping (SLAM) with previously unseen robustness in onboard aerial navigation for small platforms with natural restrictions on weight and computational power. We demonstrate successful operation in flights at altitudes between 0.2 and 70 m, trajectories of 350 m length, as well as dynamic maneuvers with a track speed of 2 m/s. All flights shown are performed autonomously using vision in the loop, with only high-level waypoints given as directions.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201210_IROS2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.iros2012.org/site/</web_url>
<event_place>Vilamoura, Portugal</event_place>
<event_name>IEEE/RSJ International Conference on Intelligent Robots and Systems 2012</event_name>
<DOI>10.1109/IROS.2012.6386270</DOI>
<authors>
<person><fn>M. W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Lynen</sn></person>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>L.</fn><sn>Kneip</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>VenrooijPMvB2012</citeid>
<title>A practical biodynamic feedthrough model for helicopters</title>
<year>2012</year>
<month>9</month>
<pages>842-854</pages>
<abstract>Biodynamic feedthrough (BDFT) occurs when vehicle vibrations and accelerations feed through the pilot’s body and cause involuntary motion of limbs, resulting in involuntary control inputs. BDFT can severely reduce ride comfort, control accuracy and, above all, safety during the operation of rotorcraft. Furthermore, BDFT can cause and sustain Rotorcraft-Pilot Couplings (RPCs). Despite many studies conducted in past decades – both within and outside of the rotorcraft community – BDFT is still a poorly understood phenomenon. The complexities involved in BDFT have kept researchers and manufacturers in the rotorcraft domain from developing robust ways of dealing with its effects. A practical BDFT pilot model, describing the amount of involuntary control inputs as a function of accelerations, could pave the way to accounting for adverse BDFT effects. In the current paper, such a model is proposed. Its structure is based on the model proposed by Mayo [1]; its accuracy and usability are improved by incorporating insights from recently obtained experimental data. An evaluation of the model performance shows that the model describes the measured data well and that it provides a considerable improvement over the original Mayo model. Furthermore, the results indicate that the neuromuscular dynamics have an important influence on the BDFT model parameters.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201209_ERF2012_MPI.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://erf2012.nlr.nl/</web_url>
<event_place>Amsterdam, Netherlands</event_place>
<event_name>38th European Rotorcraft Forum (ERF)</event_name>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>M. D.</fn><sn>Pavel</sn></person>
<person><fn>M.</fn><sn>Mulder</sn></person>
<person><fn>F. C. T.</fn><sn>van der Helm</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>PerfectJW2012</citeid>
<title>Development of Handling Qualities Requirements for a Personal Aerial Vehicle</title>
<year>2012</year>
<month>9</month>
<pages>1-18</pages>
<abstract>This paper describes progress that has been made in the European Commission-funded project myCopter on the development of handling qualities requirements for future Personal Aerial Vehicles (PAVs). A generic PAV dynamics model has been developed to permit the simulation of a range of tasks that are representative of a typical PAV commuting role. The model has been configured to provide a number of different response types – with a constant control deflection commanding a constant angular rate, a constant attitude or a constant translational rate. Results from simulation trials with test pilots have shown that, of the response types investigated, the translational rate response is most suited for PAV pilots flying low-speed tasks. Ongoing work will identify whether the response types selected with the test pilots remain valid for pilots with reduced levels of training – more akin to those expected for future PAV operations.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201209_ERF2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://erf2012.nlr.nl/</web_url>
<event_place>Amsterdam, Netherlands</event_place>
<event_name>38th European Rotorcraft Forum</event_name>
<authors>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>M. D.</fn><sn>White</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>OlivariNVBP2012</citeid>
<title>Multi-loop Pilot Behaviour Identification in Response to Simultaneous Visual and Haptic Stimuli</title>
<year>2012</year>
<month>8</month>
<day>13</day>
<pages>892-914</pages>
<abstract>The goal of this paper is to better understand how the neuromuscular system of a pilot, or more generally an operator, adapts itself to different types of haptic aids during a pitch control task. A multi-loop pilot model, capable of describing the human behaviour during a tracking task, is presented. Three different identification techniques were investigated in order to simultaneously identify the neuromuscular admittance and the visual response of a human pilot. In one of them, the various frequency response functions that build up the pilot model are identified using multi-input linear time-invariant models in ARX form. A second method makes use of cross-spectral densities and block diagram algebra to obtain the desired frequency response estimates. The identification techniques were validated using Monte Carlo simulations of a closed-loop control task. Both techniques were compared with the results of another identification method well known in the literature and based on cross-spectral density estimates. All these methods were applied in an experimental setup in which pilots performed a pitch control task with different haptic aids. Two different haptic aids for the tracking task are presented, a Direct Haptic Aid and an Indirect Haptic Aid. The two haptic aids were compared with a baseline condition in which no haptic force was used. The data obtained with the proposed method provide insight into how the pilot adapts his control behavior in relation to different haptic feedback schemes. From the experimental results it can be concluded that humans adapt their neuromuscular admittance in relation to different haptic aids. Furthermore, the two new identification techniques seemed to give more reliable admittance estimates.</abstract>
<note>http://mycopter.eu/typo3/ext/dam/mod_file/fileadmin/mycopter_user_upload/files/Downloads/Publications/201208_MST2012.pdf</note>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201208_MST2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.aiaa.org</web_url>
<event_place>Minneapolis, Minnesota</event_place>
<event_name>AIAA Modeling and Simulation Technologies Conference</event_name>
<authors>
<person><fn>M.</fn><sn>Olivari</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
<person><fn>L.</fn><sn>Pollini</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>BiegBC2012</citeid>
<title>Einfluss von Ablenkung und Augenbewegungen auf Steuerungsaufgaben</title>
<year>2012</year>
<month>8</month>
<pages>341-344</pages>
<abstract>In the present study, the influence of visual distraction on control tasks was investigated. The results indicate that even a brief shift of attention and gaze is accompanied by a systematic effect on the control task. Conversely, the concurrently performed control task also systematically influences the eye movements. Taking such interference into account can be useful in the development of graphical on-board information systems for cars or aircraft.</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_name>Mensch &amp; Computer 2012: 12. fachübergreifende Konferenz für interaktive und kooperative Medien, Mensch &amp; Computer (M&amp;C), Oldenbourg, München, Germany</event_name>
<authors>
<person><fn>H.-J.</fn><sn>Bieg</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
<person><fn>L. L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>AchtelikWLCS2012</citeid>
<title>Vision-based MAV Navigation: Implementation Challenges Towards a Usable System in Real-Life Scenarios</title>
<year>2012</year>
<month>7</month>
<day>9</day>
<pages>1-2</pages>
<abstract>In this workshop, we share our experience on autonomous flights with a MAV using a monocular camera as the only exteroceptive sensor. We developed a flexible and modular framework for realtime onboard MAV navigation, fusing visual data with IMU measurements at a rate of 1 kHz. We aim to provide a detailed insight into how our modules work and how they are used in our system. Using this framework, we achieved unprecedented MAV navigation autonomy, with flights of more than 350 m length at altitudes between 0 m and 70 m, in previously unknown areas while global signals such as GPS were not used.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201207_RSS2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://roboticsconference.org</web_url>
<booktitle>Workshop on Integration of perception with control and navigation for resource-limited, highly dynamic, autonomous systems</booktitle>
<event_place>Sydney, Australia</event_place>
<event_name>Robotics: Science and Systems</event_name>
<authors>
<person><fn>M. W.</fn><sn>Achtelik</sn></person>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>S.</fn><sn>Lynen</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>WeissACS2012</citeid>
<title>Versatile Distributed Pose Estimation and Sensor Self-Calibration for Autonomous MAVs</title>
<year>2012</year>
<month>5</month>
<day>14</day>
<pages>1-8</pages>
<abstract>In this paper, we present a versatile framework to enable autonomous flights of a Micro Aerial Vehicle (MAV) which has only slow, noisy, delayed and possibly arbitrarily scaled measurements available. Using such measurements directly for position control would be practically impossible as MAVs exhibit great agility in motion. In addition, these measurements often come from a selection of different onboard sensors, hence accurate calibration is crucial to the robustness of the estimation processes. Here, we address these problems using an EKF formulation which fuses these measurements with inertial sensors. We not only estimate the pose and velocity of the MAV, but also sensor biases, the scale of the position measurement and the self (inter-sensor) calibration in real-time. Furthermore, we show that it is possible to obtain a yaw estimate from position measurements only. We demonstrate that the proposed framework is capable of running entirely onboard a MAV, performing state prediction at a rate of 1 kHz. Our results illustrate that this approach is able to handle measurement delays (up to 500 ms), noise (std. deviation up to 20 cm) and slow update rates (as low as 1 Hz) while dynamic maneuvers are still possible. We present a detailed quantitative performance evaluation of the real system under the influence of different disturbance parameters and different sensor setups to highlight the versatility of our approach.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201205_ICRA2012.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.icra2012.org/</web_url>
<event_place>St. Paul (MN)</event_place>
<event_name>IEEE International Conference on Robotics and Automation (ICRA)</event_name>
<DOI>10.1109/ICRA.2012.6225002</DOI>
<authors>
<person><fn>S.</fn><sn>Weiss</sn></person>
<person><fn>M. W.</fn><sn>Achtelik</sn></person>
<person><fn>M.</fn><sn>Chli</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>MagnenatCM2012</citeid>
<title>Integration of online learning into HTN planning for robotic tasks</title>
<year>2012</year>
<month>3</month>
<day>26</day>
<pages>1-6</pages>
<abstract>This paper extends HTN planning with lightweight learning, considering that in robotics, actions have a non-zero probability of failing. Our work applies to A*-based HTN planners with lifting. We prove that the planner finds the plan of maximal expected utility, while retaining its lifting capability and efficient heuristic-based search. We show how to learn the probabilities online, which allows a robot to adapt by replanning on execution failures. The idea behind this work is to use the HTN domain to constrain the space of possibilities, and then to learn on the constrained space in a way requiring few training samples, rendering the method applicable to autonomous mobile robots.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/2012_ETHZ_lonelybuilder-reintegrating-ai.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.aaai.org</web_url>
<web_url2>http://people.csail.mit.edu/gdk/dir/</web_url2>
<event_place>Stanford University</event_place>
<event_name>AAAI Spring Symposium 2012</event_name>
<authors>
<person><fn>S.</fn><sn>Magnenat</sn></person>
<person><fn>J.-C.</fn><sn>Chappelier</sn></person>
<person><fn>F.</fn><sn>Mondada</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>ChuangNB2013</citeid>
<title>Investigating Gaze Behavior of Novice Pilots during Basic Flight Maneuvers</title>
<journal>International Conference on Human-Computer Interaction in Aerospace, Brussels, Belgium</journal>
<year>2012</year>
<month>9</month>
<day>12</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://research.fit.edu/hci-aero/HCI-Aero2012/Home.html</web_url>
<authors>
<person><fn>L. L.</fn><sn>Chuang</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>BiegBBC2012</citeid>
<title>Asymmetries in saccadic latencies during interrupted ocular pursuit</title>
<journal>35th European Conference on Visual Perception, Alghero, Italy, Perception</journal>
<year>2012</year>
<month>9</month>
<volume>41 (ECVP Abstract Supplement)</volume>
<pages>137</pages>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<authors>
<person><fn>H.-J.</fn><sn>Bieg</sn></person>
<person><fn>J.-P.</fn><sn>Bresciani</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
<person><fn>L. L.</fn><sn>Chuang</sn></person>
</authors>
</reference>
<reference>
<bibtype>poster</bibtype>
<citeid>ChuangNB2012</citeid>
<title>Eye-movement planning during flight maneuvers</title>
<journal>35th European Conference on Visual Perception, Alghero, Italy, Perception</journal>
<year>2012</year>
<month>9</month>
<volume>41 (ECVP Abstract Supplement)</volume>
<pages>99</pages>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<authors>
<person><fn>L. L.</fn><sn>Chuang</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2013_2</citeid>
<title>Flying Robots and Flying Cars</title>
<year>2012</year>
<month>12</month>
<day>12</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://rslab.kaist.ac.kr/xe/?document_srl=626#0</web_url>
<event_place>Korea Advanced Institute of Science and Technology: Robotics and Simulation Laboratory, Daejeon, South Korea</event_place>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Nieuwenhuizen2012</citeid>
<title>myCopter – Enabling Technologies for Personal Aerial Transportation Systems: A progress report</title>
<year>2012</year>
<month>11</month>
<day>7</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Frankfurt am Main, Germany</event_place>
<event_name>HELI World conference 2012</event_name>
<authors>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>NieuwenhuizenB2012</citeid>
<title>myCopter – Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2012</year>
<month>10</month>
<day>24</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Brussels, Belgium</event_place>
<event_name>Joint EU–US Workshop on Small Aircraft and Personal Planes Systems</event_name>
<authors>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2012</citeid>
<title>A Cybernetics Approach to Perception and Action</title>
<year>2012</year>
<month>10</month>
<day>15</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Keynote Lecture</talk_type>
<event_place>Seoul, Korea</event_place>
<event_name>IEEE International Conference on Systems, Man, and Cybernetics (SMC 2012)</event_name>
<authors>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2012_3</citeid>
<title>The MPI View on Shared Control</title>
<year>2012</year>
<month>10</month>
<day>14</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Seoul, Korea</event_place>
<event_name>SMC 2012 Workshop on Shared Control</event_name>
<authors>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2012_4</citeid>
<title>Flying Robots and Flying Cars</title>
<year>2012</year>
<month>10</month>
<day>10</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<event_place>Seoul, Korea</event_place>
<event_name>College of Information and Communications: Korea University</event_name>
<authors>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>JFH2012</citeid>
<title>What if we simply fly to work? myCopter – Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2012</year>
<month>9</month>
<day>26</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_name>Deutsch-Italienische Handelskammer: Workshop zur Investorengewinnung, Vicenza, Italy</event_name>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>VenrooijNB2013</citeid>
<title>What if we simply fly to work? myCopter – Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2012</year>
<month>9</month>
<day>25</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_name>Deutsch-Italienische Handelskammer: Workshop zur Investorengewinnung, Torino, Italy</event_name>
<authors>
<person><fn>J.</fn><sn>Venrooij</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>BulthoffN2012</citeid>
<title>Und wenn wir einfach zur Arbeit fliegen?</title>
<year>2012</year>
<month>9</month>
<day>18</day>
<abstract>What conditions are necessary to establish flying cars in our society?

An everyday morning scenario: traffic jams on the motorways, the cities' main roads are clogged, trains and buses are hopelessly overcrowded. Commuter traffic has long since reached its limits, and expanding the existing transport network can provide only limited relief. In many places there is simply not enough space for new roads, and maintaining the existing ones already costs enormous sums. So what are the alternatives? Quite simply: individual transport takes off into the third dimension! Prof. Heinrich Bülthoff of the Max Planck Institute for Biological Cybernetics in Tübingen pursues this vision with the EU project "myCopter". The goal is not to build a flying car, but rather to clarify the technical and societal conditions under which flying cars could become a useful means of transport accepted by society. With that, our commute to work will, in a hopefully not too distant future, once again be more relaxed.

Besides the MPI for Biological Cybernetics, the consortium includes the University of Liverpool, the École Polytechnique in Lausanne, ETH Zürich, the Karlsruhe Institute of Technology and the German Aerospace Center (DLR).</abstract>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<web_url>http://www.gdnae.de/</web_url>
<event_place>Göttingen, Germany</event_place>
<event_name>127. Versammlung Gesellschaft Deutscher Naturforscher und Ärzte (GDNÄ)</event_name>
<authors>
<person><fn>H. H.</fn><sn>Bülthoff</sn></person>
<person><fn>F. M.</fn><sn>Nieuwenhuizen</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Jump2012</citeid>
<title>The Flying Car</title>
<year>2012</year>
<month>7</month>
<day>10</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://www.youtube.com/watch?v=6NjetSo-Mxk</web_url>
<event_name>Vauxhall Ampera Forethought x 5x15, King's Cross Filling Station, London, UK</event_name>
<authors>
<person><fn>M.</fn><sn>Jump</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2012_6</citeid>
<title>The Cybernetic Approach to Perception and Action</title>
<year>2012</year>
<month>5</month>
<day>23</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<event_place>Universität Bielefeld, Bielefeld, Germany</event_place>
<event_name>CITEC Colloquium "Vision Science"</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Decker2012</citeid>
<title>Flying to work? Take a MyCopter!</title>
<year>2012</year>
<month>5</month>
<day>10</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<event_place>London, UK</event_place>
<event_name>13th EU Hitachi Science and Technology Forum "Transport and Mobility towards 2050"</event_name>
<authors>
<person><fn>M.</fn><sn>Decker</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2012_5</citeid>
<title>myCopter: Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2012</year>
<month>3</month>
<day>7</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Abu Dhabi, United Arab Emirates</event_place>
<event_name>Abu Dhabi Air Expo: Helicopter Conference</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2012_2</citeid>
<title>Flying Robots and Flying Cars</title>
<year>2012</year>
<month>3</month>
<day>1</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<web_url>http://expertdays.schunk.com/index.php?id=49</web_url>
<event_place>Hausen, Germany</event_place>
<event_name>5th Schunk International Expertdays: Service Robotics</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>NieuwenhuizenJPW2011</citeid>
<title>myCopter – Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2011</year>
<month>11</month>
<day>4</day>
<pages>1-8</pages>
<abstract>Current road transportation systems throughout the European Union suffer from severe congestion problems. A solution can be to move towards a Personal Aerial Transportation System, in which vehicles would also have vertical space at their disposal. In the myCopter project, funded by the European Union under the 7th Framework Programme, the viability of such a system will be investigated. It is argued that this should be done by taking into account the required operational infrastructure, instead of starting with the design of a vehicle. By investigating human-machine interfaces and training, automation technologies, and socio-economic impact, the myCopter project aims to provide a basis for a transportation system based on Personal Aerial Vehicles. In this paper, an outline of the project is given. Early research results are detailed and provide a basis for the remainder of the project.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201111_HeliWorld2011.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Frankfurt, Germany</event_place>
<event_name>3rd International HELI World Conference 2011</event_name>
<digital>1</digital>
<authors>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>G.D.</fn><sn>Padfield</sn></person>
<person><fn>D.</fn><sn>Floreano</sn></person>
<person><fn>F.</fn><sn>Schill</sn></person>
<person><fn>J.-C.</fn><sn>Zufferey</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
<person><fn>S.</fn><sn>Bouabdallah</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
<person><fn>S.</fn><sn>Meyer</sn></person>
<person><fn>J.</fn><sn>Schippl</sn></person>
<person><fn>M.</fn><sn>Decker</sn></person>
<person><fn>B.</fn><sn>Gursky</sn></person>
<person><fn>M.</fn><sn>Höfinger</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>PomerleauMCLS2011</citeid>
<title>Tracking a Depth Camera: Parameter Exploration for Fast ICP</title>
<year>2011</year>
<month>9</month>
<day>29</day>
<pages>3824-3829</pages>
<abstract>The increasing number of ICP variants leads to an explosion of algorithms and parameters. This renders difficult the selection of the appropriate combination for a given application. In this paper, we propose a state-of-the-art, modular, and efficient implementation of an ICP library. We took advantage of the recent availability of fast depth cameras to demonstrate one application example: a 3D pose tracker running at 30 Hz. For this application, we show the modularity of our ICP library by optimizing the use of lean and simple descriptors in order to ease the matching of 3D point clouds. This tracker is then evaluated using datasets recorded along a ground truth of millimeter accuracy. We provide both source code and datasets to the community in order to accelerate further comparisons in this field.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201109_ETHZ_tracker_iros.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>San Francisco, CA</event_place>
<event_name>IEEE/RSJ International Conference on Intelligent Robots and Systems 2011</event_name>
<DOI>10.1109/IROS.2011.6094861</DOI>
<authors>
<person><fn>F.</fn><sn>Pomerleau</sn></person>
<person><fn>S.</fn><sn>Magnenat</sn></person>
<person><fn>F.</fn><sn>Colas</sn></person>
<person><fn>M.</fn><sn>Liu</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>JumpPPW2011</citeid>
<title>myCopter: Enabling Technologies for Personal Air Transport Systems - An Early Progress Report</title>
<year>2011</year>
<month>9</month>
<day>13</day>
<pages>1-12</pages>
<abstract>This paper describes the European Commission (EC) Framework 7 funded project myCopter (2011-2014). The project is still at an early stage so the paper starts with a discussion of the current transportation issues faced, for example, by European countries and describes a means to solve them through the use of a personal aerial transportation system (PATS). The concept of personal air vehicles (PAVs) is briefly reviewed and how this project intends to tackle the problem described. It is argued that the key reason that many PAV concepts have failed is because the operational infrastructure and socio-economic issues have not been properly addressed; rather, the start point has been the design of the vehicle itself. Some of the key aspects that would make a PATS viable include the required infrastructure and associated technologies, the skill levels and machine interfaces needed by the occupant or pilot and the views of society as a whole on the acceptability of such a proposition. The myCopter project will use these areas to explore the viability of PAVs within a PATS. The paper reports upon the early progress made within the project. An initial reference set of PAV requirements has been collated. A conceptual flight simulation model capable of providing a wide range of handling qualities characteristics has been developed and its function has undergone limited verification. Results from this exercise show that the model behaves as intended and that it can deliver a predictable range of vehicle dynamics. The future direction of the project is then described.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201109_ERF2011.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Gallarate, Italy</event_place>
<event_name>37th European Rotorcraft Forum</event_name>
<digital>1</digital>
<authors>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>P.</fn><sn>Perfect</sn></person>
<person><fn>G.D.</fn><sn>Padfield</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>D.</fn><sn>Floreano</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
<person><fn>J.-C.</fn><sn>Zufferey</sn></person>
<person><fn>F.</fn><sn>Schill</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
<person><fn>S.</fn><sn>Bouabdallah</sn></person>
<person><fn>M.</fn><sn>Decker</sn></person>
<person><fn>J.</fn><sn>Schippl</sn></person>
<person><fn>S.</fn><sn>Meyer</sn></person>
<person><fn>M.</fn><sn>Höfinger</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inproceedings</bibtype>
<citeid>JumpPWF2011</citeid>
<title>myCopter: Enabling Technologies for Personal Air Transport Systems</title>
<year>2011</year>
<month>6</month>
<day>16</day>
<pages>1-15</pages>
<abstract>This paper describes the European Commission Framework 7 funded project myCopter (2011-2014). The project is still at an early stage so the paper starts with the current transportation issues faced by developed countries and describes a means to solve them through the use of personal aerial transportation. The concept of personal air vehicles (PAV) is briefly reviewed and how this project intends to tackle the problem from a different perspective described. It is argued that the key reason that many PAV concepts have failed is because the operational infrastructure and socio-economic issues have not been properly addressed; rather, the start point has been the design of the vehicle itself. Some of the key aspects that would make a personal aerial transport system (PATS) viable include the required infrastructure and associated technologies, the skill levels and machine interfaces needed by the occupant or pilot and the views of society as a whole on the acceptability of such a proposition. The myCopter project will use these areas to explore the viability of PAVs within a PATS. The paper provides an overview of the project structure, the roles of the partners, and hence the available research resources, and some of the early thinking on each of the key project topic areas.</abstract>
<file_url>http://www.mycopter.eu/fileadmin/mycopter_user_upload/files/Downloads/Publications/201106_RAeS_future_rotorcraft.pdf</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<publisher>Royal Aeronautical Society</publisher>
<address>London, UK</address>
<event_place>London, UK</event_place>
<event_name>The Future Rotorcraft: Enabling capability through the application of technology</event_name>
<digital>1</digital>
<authors>
<person><fn>M.</fn><sn>Jump</sn></person>
<person><fn>G.D.</fn><sn>Padfield</sn></person>
<person><fn>M.D.</fn><sn>White</sn></person>
<person><fn>D.</fn><sn>Floreano</sn></person>
<person><fn>P.</fn><sn>Fua</sn></person>
<person><fn>J.-C.</fn><sn>Zufferey</sn></person>
<person><fn>F.</fn><sn>Schill</sn></person>
<person><fn>R.</fn><sn>Siegwart</sn></person>
<person><fn>S.</fn><sn>Bouabdallah</sn></person>
<person><fn>M.</fn><sn>Decker</sn></person>
<person><fn>J.</fn><sn>Schippl</sn></person>
<person><fn>M.</fn><sn>Höfinger</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>inbook</bibtype>
<citeid>Decker2011</citeid>
<title>myCopter: Enabling Technologies for Personal Air Transport Systems</title>
<year>2011</year>
<month>4</month>
<day>1</day>
<pages>107-108</pages>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<editor>ITAS</editor>
<publisher>ITAS</publisher>
<address>Karlsruhe</address>
<booktitle>Technikfolgenabschätzung – Theorie und Praxis</booktitle>
<authors>
<person><fn>M.</fn><sn>Decker</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2011</citeid>
<title>Science and Science Fiction: closing the loop between Perception and Technology</title>
<year>2011</year>
<month>10</month>
<day>5</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Keynote Lecture</talk_type>
<event_place>Seoul, Korea</event_place>
<event_name>Department of Brain and Cognitive Engineering, Korea University</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>Bulthoff2011_2</citeid>
<title>Science and Science Fiction: closing the loop between Cognition and Application</title>
<year>2011</year>
<month>6</month>
<day>20</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<talk_type>Invited Lecture</talk_type>
<event_place>Università degli Studi di Genova, Genova, Italy</event_place>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
</authors>
</reference>
<reference>
<bibtype>conference</bibtype>
<citeid>BulthoffN2011</citeid>
<title>myCopter - Enabling Technologies for Personal Aerial Transportation Systems</title>
<year>2011</year>
<month>3</month>
<day>31</day>
<file_url>http://mycopter.eu</file_url>
<file_url2>http://mycopter.eu</file_url2>
<file_url3>http://mycopter.eu</file_url3>
<event_place>Madrid, Spain</event_place>
<event_name>6th European Aerodays 2011</event_name>
<authors>
<person><fn>H.H.</fn><sn>Bülthoff</sn></person>
<person><fn>F.M.</fn><sn>Nieuwenhuizen</sn></person>
</authors>
</reference>
</sevenpack>
