
Beijing Jinfa Technology Co., Ltd

  • E-mail

    sales@kingfar.cn

  • Phone

    13021282218

  • Address

    1st and 3rd floors of Building 1, No. 12 Qinghe Anningzhuang Back Street, Haidian District, Beijing


ErgoLAB Eye Movement Analysis Module

Price: Negotiable (updated 01/30)
Nature of the Manufacturer: Producer
Overview
The ErgoLAB eye movement analysis module is professional eye-tracking analysis software that works with desktop, remote (telemetry), wearable, and virtual-reality eye trackers to process and comprehensively analyze eye movement data. Eye-tracking data reveal an individual's attention allocation, emotional arousal, fatigue, and workload, showing how people visually perceive their surroundings and what drives their decisions, and capturing natural reactions that are not governed by conscious control to support human factors assessment of interface, product, cockpit, building, and environment designs.
Product Details

ErgoLAB Eye Movement Analysis Module

1. Product Introduction

The ErgoLAB eye movement analysis module can be combined with many types of eye-tracking hardware, including desktop, remote (telemetry), wearable, and virtual-reality eye-tracking systems, to process, analyze, and comprehensively evaluate eye movement data. It is professional eye-tracking analysis software suitable for research in many fields. Eye movement data reveal an individual's attention allocation, emotional arousal, fatigue, and stress states, showing how people visually perceive the things around them and what drives their decisions. Natural reactions that are not governed by conscious control can be captured, recorded, and fed back in real time, supporting human factors assessments of interface, product, cockpit, building, and environment designs.

The ErgoLAB eye movement analysis module is integrated into the ErgoLAB human-machine-environment synchronization cloud platform. The platform is built on technology independently developed by Jinfa Technology, holds multiple invention patents and software copyrights, has passed provincial- and ministerial-level new technology and new product certifications, and carries international management system certifications including European CE, American FCC, EU RoHS, ISO9001, ISO14001, and OHSAS18001. It is widely used in research across many fields, including human-machine efficiency evaluation, human factors intelligence and human-computer interaction, industrial design and product usability testing, driving behavior and traffic safety, industrial management and human factors engineering, safety science and emergency behavior research, environmental behavior and architectural design, product and advertising marketing, and management and behavior observation.

2. Product Features

1. Standardized eye movement data processing

The system uses the I-VT (velocity-threshold identification) fixation-detection algorithm, with built-in standardized processing parameters and options to customize them, to accurately extract the user's fixation points as well as eye movement events such as blinks and saccades.
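
The I-VT technique itself is well documented in the eye-tracking literature; a minimal sketch of the idea, assuming gaze samples arrive as (timestamp, x, y) arrays and using a hypothetical pixel-based velocity threshold, might look like this (an illustration of the general algorithm, not Jinfa's implementation):

    import numpy as np

    def ivt_classify(t, x, y, velocity_threshold=100.0):
        """Label each gaze sample as fixation (True) or saccade (False).
        t, x, y: 1-D arrays (timestamps in seconds, gaze position in pixels).
        velocity_threshold is a hypothetical cutoff in px/s; production
        systems usually express it in degrees of visual angle per second."""
        t, x, y = map(np.asarray, (t, x, y))
        speed = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
        # A sample belongs to a fixation when gaze velocity stays below threshold.
        return np.concatenate([[True], speed < velocity_threshold])

    def group_fixations(t, x, y, is_fix, min_duration=0.06):
        """Merge consecutive fixation samples into (start, end, cx, cy) events,
        dropping groups shorter than min_duration seconds."""
        t, x, y = map(np.asarray, (t, x, y))
        events, start = [], None
        for i, fix in enumerate(is_fix):
            if fix and start is None:
                start = i
            elif not fix and start is not None:
                if t[i - 1] - t[start] >= min_duration:
                    events.append((t[start], t[i - 1],
                                   x[start:i].mean(), y[start:i].mean()))
                start = None
        if start is not None and t[-1] - t[start] >= min_duration:
            events.append((t[start], t[-1], x[start:].mean(), y[start:].mean()))
        return events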

2. Eye tracking visualization analysis

The module supports visualization methods such as eye movement heat maps, gaze trajectory (scanpath) maps, and 3D maps over a variety of stimulus materials, including images, videos, and 3D virtual scenes, giving an intuitive picture of an individual's eye movements and gaze behavior.
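
As an illustration of how a fixation heat map is typically produced (a generic sketch, not the platform's renderer), each fixation deposits a Gaussian weighted by its duration, and the accumulated intensity map is overlaid on the stimulus image. The (start, end, cx, cy) event format follows the I-VT sketch above:

    import numpy as np

    def fixation_heatmap(fixations, width, height, sigma=40.0):
        """Accumulate duration-weighted Gaussians at fixation centers.
        fixations: iterable of (start, end, cx, cy) events.
        Returns a (height, width) intensity map to overlay on the stimulus."""
        yy, xx = np.mgrid[0:height, 0:width]
        heat = np.zeros((height, width))
        for start, end, cx, cy in fixations:
            weight = end - start  # longer fixations contribute more heat
            heat += weight * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2)
                                    / (2 * sigma ** 2))
        return heat / heat.max() if heat.max() > 0 else heat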

3. Statistical analysis of eye movement area of interest (AOI)

Eye-tracking AOI statistics can be computed over images, videos, 3D virtual scenes, and other stimuli. Indicators include time to first entry and the number of fixations before entry, the position of the first fixation, first fixation duration, total visit duration, average visit duration, visit count, total fixation duration, average fixation duration, fixation count, average fixation frequency, second fixation duration, nearest neighboring fixation point, and more.
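
For a concrete picture of how a few of these indicators are derived, here is a simplified sketch assuming rectangular AOIs and the fixation events from the I-VT sketch above; the platform's own metric definitions may differ:

    def aoi_metrics(fixations, aoi):
        """Compute basic AOI statistics from (start, end, cx, cy) fixations.
        aoi: (left, top, right, bottom) rectangle in stimulus coordinates."""
        left, top, right, bottom = aoi
        inside = [f for f in fixations
                  if left <= f[2] <= right and top <= f[3] <= bottom]
        if not inside:
            return None
        durations = [end - start for start, end, _, _ in inside]
        first = inside[0]
        return {
            # elapsed time from the first recorded fixation to first AOI entry
            "time_to_first_fixation": first[0] - fixations[0][0],
            "first_fixation_duration": first[1] - first[0],
            "fixation_count": len(inside),
            "total_fixation_duration": sum(durations),
            "average_fixation_duration": sum(durations) / len(inside),
        }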

4. Eye movement interaction path extraction

Using the sequence analysis algorithm embedded in the platform, the order in which different areas of interest are gazed at can be analyzed as a visual sequence. On the one hand, this verifies whether a product or interface layout and its element design match users' natural eye movement interaction order; on the other, extracting typical eye movement interaction paths provides a basis for product design and optimization.
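
A simple way to extract such a gaze sequence (an illustrative sketch; the platform's embedded sequence analysis algorithm is not documented here) is to map each fixation to the AOI it lands in and collapse consecutive repeats:

    def scanpath(fixations, aois):
        """Return the order in which AOIs were visited.
        aois: dict mapping name -> (left, top, right, bottom) rectangle.
        Consecutive fixations inside the same AOI count as one visit."""
        path = []
        for _, _, cx, cy in fixations:
            for name, (l, t, r, b) in aois.items():
                if l <= cx <= r and t <= cy <= b:
                    if not path or path[-1] != name:
                        path.append(name)
                    break
        return path

    # e.g. scanpath(fixes, {"logo": (0, 0, 200, 80), "menu": (0, 80, 200, 600)})
    # might yield ["logo", "menu", "logo"]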

5. Analysis of eye movement trajectory similarity

By analyzing the similarity of AOI visit sequences across individuals and computing similarity indicators, the module can assess the consistency and compliance of operations across recordings, or compare novices with experienced operators.
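
One widely used way to quantify such similarity (again a generic sketch; the platform's actual metric is not specified here) is the Levenshtein edit distance between two AOI visit sequences, normalized to a score in [0, 1]:

    def scanpath_similarity(a, b):
        """Normalized edit-distance similarity between two AOI sequences.
        1.0 means identical visit orders; 0.0 means nothing in common."""
        m, n = len(a), len(b)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i
        for j in range(n + 1):
            d[0][j] = j
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,         # deletion
                              d[i][j - 1] + 1,         # insertion
                              d[i - 1][j - 1] + cost)  # substitution
        return 1.0 - d[m][n] / max(m, n, 1)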

6. Eye movement interaction analysis based on product prototypes

Through the ErgoLAB human-machine-environment synchronization cloud platform, researchers can work directly with web, app, VR, and HMI prototypes. Prototype components are automatically recognized as eye movement areas of interest, overcoming the limitations of screen-recording-based eye tracking tests, preserving the full eye movement record, and linking analysis results directly to the prototype code, which makes product optimization easier for designers.

7. Eye movement and comprehensive analysis of multidimensional data

Through the ErgoLAB human-machine-environment synchronization cloud platform, eye movement data can be analyzed jointly with multidimensional data such as physiological signals, EEG, behavior, brain imaging, motion and posture, simulator output, and facial expressions. This includes cross-analysis, spatiotemporal analysis, and visualization of multidimensional data, providing a more complete set of indicators for research.

3. Research Support and Joint Experiments

Building on laboratory construction, Jinfa Technology provides scientific research support and joint experiment services. These combine new technologies and methods with usability testing research to promote the delivery and application of results in the field, and use industry-academia-research cooperation to incubate more new technologies and products across industries, domains, and specialties.

4. Manufacturer Introduction

Beijing Jinfa Technology Co., Ltd. is a technology-based small and medium-sized enterprise with independent import and export rights. Its independently developed ergonomics-related technologies, products, and services have won multiple provincial- and ministerial-level science and technology awards, invention patents, software copyrights, and new technology and new product (service) designations, and have passed multiple international and explosion-proof certifications, including European CE, American FCC, EU RoHS, ISO9001, ISO14001, and OHSAS18001.

Jinfa Technology has built academic research, technical, and R&D teams specializing in human factors engineering. Through years of industry-academia-research cooperation with research institutions and universities, it has accumulated core technologies in human factors and ergonomics, including state recognition and human-machine efficiency evaluation techniques and research methods based on machine learning and other artificial intelligence algorithms, making it a domestic center of technological innovation in human factors engineering. Jinfa Technology's laboratory planning and construction team provides technical support and after-sales service covering the entire laboratory lifecycle, from planning and layout to equipment training and guidance.