{"id":774,"date":"2019-12-04T01:25:49","date_gmt":"2019-12-04T01:25:49","guid":{"rendered":"http:\/\/ming3d.com\/VR\/?p=774"},"modified":"2020-02-16T17:34:19","modified_gmt":"2020-02-16T17:34:19","slug":"eye-tracking-workshop-2-data-interpretation","status":"publish","type":"post","link":"https:\/\/ming3d.com\/VR\/2019\/12\/04\/eye-tracking-workshop-2-data-interpretation\/","title":{"rendered":"Eye Tracking Technology"},"content":{"rendered":"<p><a href=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4.jpeg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-795 size-large\" src=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4-1030x506.jpeg\" alt=\"\" width=\"1030\" height=\"506\" srcset=\"https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4-1030x506.jpeg 1030w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4-300x147.jpeg 300w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4-705x347.jpeg 705w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4-450x221.jpeg 450w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/Eye-Tracking_4.jpeg 1251w\" sizes=\"auto, (max-width: 1030px) 100vw, 1030px\" \/><\/a><\/p>\n<h2>1. Wearable Eye-tracking technology.<\/h2>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-775\" src=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/student_ET-495x400.jpg\" alt=\"student_ET\" width=\"365\" height=\"295\" srcset=\"https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/student_ET-495x400.jpg 495w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/student_ET-845x684.jpg 845w\" sizes=\"auto, (max-width: 365px) 100vw, 365px\" \/><\/p>\n<p>This wearable ET device includes various components, such as illuminators, cameras, and a data collection and processing unit for image detection, 3D eye model, and gaze mapping algorithms. 
Compared to a screen-based ET device, the most significant differences of the wearable ET device are its binocular coverage and a field of view (FOV) that moves with head orientation, so head tilt affects the glasses-configured eye tracker. It also avoids potential experimental bias resulting from the display size or pixel dimensions of a screen. As with screen-based ET, the images captured by the wearable ET cameras are used to identify the glints on the cornea and the pupil. This information, together with a 3D eye model, is then used to estimate the gaze vector and gaze point for each participant.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone\" src=\"https:\/\/www.tobiipro.com\/imagevault\/publishedmedia\/1cr7b4clf3qahnqp031c\/How_DoesEyetrackingWork_GlassesPro2.jpg\" alt=\"\" width=\"1000\" height=\"667\" \/><\/p>\n<div class='avia-iframe-wrap'><iframe loading=\"lazy\" title=\"Eye_Tracking study at DAAP UC\" width=\"1500\" height=\"844\" src=\"https:\/\/www.youtube.com\/embed\/dfI1ZINgMm8?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/div>\n<p>After a standard ET calibration and verification procedure, participants were instructed to walk through a defined space while wearing the glasses. In this case, the time of interest (TOI) was set to 60 seconds, recording defined start and end events along with the visual occurrences over that period. 
Data were collected for both pre-conscious viewing (the first three seconds) and conscious viewing (after three seconds).<\/p>\n<p><a href=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-776\" src=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2-1030x425.jpg\" alt=\"cafe2_h2\" width=\"1030\" height=\"425\" srcset=\"https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2-1030x425.jpg 1030w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2-300x124.jpg 300w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2-1500x619.jpg 1500w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2-705x291.jpg 705w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/cafe2_h2-450x186.jpg 450w\" sizes=\"auto, (max-width: 1030px) 100vw, 1030px\" \/><\/a><\/p>\n<p><a href=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-large wp-image-779\" src=\"http:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz-1030x425.jpg\" alt=\"DAAP_cafe2_gz\" width=\"1030\" height=\"425\" srcset=\"https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz-1030x425.jpg 1030w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz-300x124.jpg 300w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz-1500x619.jpg 1500w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz-705x291.jpg 705w, https:\/\/ming3d.com\/VR\/wp-content\/uploads\/2019\/12\/DAAP_cafe2_gz-450x186.jpg 450w\" sizes=\"auto, (max-width: 1030px) 100vw, 1030px\" \/><\/a><\/p>\n<p>We also performed screen-based eye tracking and compared the results.<\/p>\n<h2>2. 
Screen-based Eye-tracking technology<\/h2>\n<p>This method was beneficial for informing reviewers of how an existing place or a proposed design performs in terms of user experience. Moreover, while the fundamental visual elements that attract human attention and trigger conscious viewing are well-established and sometimes incorporated into signage design and placement, signs face an additional challenge because they must compete for viewers\u2019 visual attention in the context of the visual elements of the surrounding built and natural environments. As such, tools and methods are needed that can assist \u201ccontextually-sensitive\u201d design and placement by assessing how signs in situ capture the attention of their intended viewers.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone\" src=\"https:\/\/www.tobiipro.com\/imagevault\/publishedmedia\/aetq5hdbo7al15g49x34\/How_DoesEyetrackingWork_ScreenBased.jpg\" alt=\"\" width=\"1000\" height=\"667\" \/><\/p>\n<div class='avia-iframe-wrap'><iframe loading=\"lazy\" title=\"screen-based Eye-tracking\" width=\"1500\" height=\"844\" src=\"https:\/\/www.youtube.com\/embed\/Y5zykCi1ZMk?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture\" allowfullscreen><\/iframe><\/div>\n<h2>3. VR-Based Eye-tracking technology<\/h2>\n<p>Eye-tracking technology enables new forms of interaction in VR, with benefits to hardware manufacturers, software developers, end users, and research professionals.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone\" src=\"https:\/\/www.tobiipro.com\/imagevault\/publishedmedia\/w0hpjcq7q69cj6ekvfzf\/TobiiPro-VR-Analytics-for-pre-production-design-te.jpg\" alt=\"\" width=\"1000\" height=\"500\" \/><\/p>\n<h2>Papers:<\/h2>\n<p><b>Tang, M.<\/b> 
<a href=\"http:\/\/ming3d.com\/upload\/paper\/Signage_2020.pdf\"><i>Analysis of Signage using Eye-Tracking Technology<\/i><\/a><i>.<\/i> <a href=\"https:\/\/journals.shareok.org\/ijsw\/issue\/view\/7\">Interdisciplinary Journal of Signage and Wayfinding<\/a>. 02. 2020.<\/p>\n<p><span class=\"c6 c8\">Tang, M<\/span><span class=\"c6\">. and Auffrey, C. \u201c<\/span><span class=\"c3\"><a class=\"c2\" href=\"https:\/\/www.google.com\/url?q=http:\/\/ming3d.com\/upload\/paper\/Urban_Rail_Transit_2018_Tang.pdf&amp;sa=D&amp;ust=1580233510528000\">Advanced Digital Tools for Updating Overcrowded Rail Stations: Using Eye Tracking, Virtual Reality, and Crowd Simulation to Support Design Decision-Making.<\/a><\/span><span class=\"c6\">\u201d <\/span><span class=\"c3\"><a class=\"c2\" href=\"https:\/\/www.google.com\/url?q=https:\/\/link.springer.com\/journal\/40864&amp;sa=D&amp;ust=1580233510528000\">Urban Rail Transit<\/a><\/span><span class=\"c0\">, December 19, 2018.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>1. Wearable Eye-tracking technology. This wearable ET device includes various components, such as illuminators, cameras, and a data collection and processing unit for image detection, 3D eye model, and gaze mapping algorithms. 
Compared to the screen-based ET device, the most significant differences of the wearable ET device are its binocular coverage, a field of view [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":795,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[29],"class_list":["post-774","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized","tag-eye-tracking"],"_links":{"self":[{"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/posts\/774","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/comments?post=774"}],"version-history":[{"count":11,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/posts\/774\/revisions"}],"predecessor-version":[{"id":798,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/posts\/774\/revisions\/798"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/media\/795"}],"wp:attachment":[{"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/media?parent=774"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/categories?post=774"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ming3d.com\/VR\/wp-json\/wp\/v2\/tags?post=774"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}