Enrique Hortal

Welcome to my online curriculum 

About me

Personal Info

Name: Enrique Hortal Quesada

Date of birth: 30 April 1985

Nationality: Spanish

Location: Maastricht, the Netherlands

In April 2016, I joined the Department of Data Science and Knowledge Engineering (renamed in 2022 the Department of Advanced Computing Sciences, DACS) at Maastricht University, the Netherlands, as a post-doctoral researcher in the RAI (Robotics, Agents and Interactions) group. Since August 2019, I have been working as an assistant professor in the same department. My teaching activities are mainly related to Computer Science and Machine Learning. Since the academic year 2019/2020, I have also been the coordinator of the Bachelor thesis for the BSc in Data Science and Artificial Intelligence at DACS.

I have a background in brain signal processing and classification as a former member of the Brain-Machine Interface Systems Lab at Miguel Hernández University of Elche, Spain. In this group, I gained my first research experience and obtained a PhD in Industrial and Telecommunications Technologies in February 2016.

My research interests include machine/deep learning, human-machine interaction, (bio)signal processing and analysis, and affective computing. Until March 2019, I worked on the MaTHiSiS project as a senior researcher in charge of coordination and research activities, co-supervising two PhD candidates. In the context of this European project (Horizon2020-ICT-20-2015), we developed an adaptive learning platform to enhance vocational training, workplace learning and mainstream education for individuals with or without learning disabilities. In parallel, I carry out teaching responsibilities in several courses, such as Computer Science and Machine Learning, coordinate and examine BSc research projects, and supervise several BSc and Master students’ theses.

Thank you for visiting my profile!

¡Gracias por visitar mi perfil!

Experience

Maastricht University, the Netherlands

August 2019 – Present

Assistant professor at the Department of Advanced Computing Sciences.


April 2016 – July 2019

Post-doctoral researcher at the Department of Advanced Computing Sciences.

Miguel Hernández University of Elche, Spain

September 2012 – December 2015

Scientific Researcher at the Engineering and Automation Department and member of the Brain-Machine Interface Systems Lab.

Sistemas Avanzados Telecom Levante, Spain

 February 2007 – August 2012

Microcontroller programmer at the Computer Science Department

Miguel Hernández University of Elche, Spain

October 2006 – January 2007

Maintenance and management of the high-frequency electronic devices laboratory at the Engineering Department.

Education

Miguel Hernández University of Elche, Spain

PhD in Industrial and Telecommunications Technologies (Tecnologías Industriales y de Telecomunicación)

February 2016

Miguel Hernández University of Elche, Spain

Master's degree in Industrial and Telecommunications Technologies (Tecnologías Industriales y de Telecomunicación)

October 2012

Miguel Hernández University of Elche, Spain

Bachelor's degree in Telecommunications Engineering, specialized in electronic systems (Ingeniero técnico en telecomunicaciones, especializado en sistemas electrónicos)

February 2009

Projects

JOINclusion – Joint problem-solving strategy towards social inclusion of children with a migrant background

JOINclusion is an Erasmus+ project under Key Action 2 – Cooperation partnerships in school education, with a total budget of €291,240.

JOINclusion is intended to foster the social inclusion of all primary school children (with particular attention to those with a migrant background) through the use of a collaborative mobile application. This tool, designed by psychologists specializing in the field, builds on empathy-based learning scenarios that strengthen the impact of its use. The scenarios are designed to promote dialogue between participants and provide channels for them to express themselves, thereby promoting integration. Both the game and its scenarios are designed around schools' needs, involving end-users from the early stages of the project's development.

Furthermore, the collaborative serious game is boosted by machine learning techniques conceived to enhance the user experience and optimize its efficiency as a tool for promoting prosociality, through personalisation based on affect detection. Interaction with the collaborative serious game is also designed to encourage self-reflection and post-game interaction among the members of the targeted groups (children with and without a migrant background), while, for children at risk of exclusion, it provides reassurance and encourages them to learn to act as equal members of their new community and find their place in society.
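As a rough, hypothetical illustration (not the JOINclusion implementation), affect detection can be framed as a classifier that maps in-game interaction features to an affect label, which the game can then use to personalise its scenarios. The features, labels and model below are assumptions made only for this sketch.

```python
# Minimal sketch, assuming hypothetical in-game interaction features; not the JOINclusion code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-session features:
# [mean response time (s), help requests, cooperative actions, errors]
# Illustrative affect labels: 0 = engaged, 1 = frustrated, 2 = bored.
X_train = np.array([[1.2, 0, 5, 1],
                    [4.8, 3, 1, 6],
                    [6.0, 0, 0, 0],
                    [1.0, 1, 4, 2],
                    [5.2, 4, 0, 7],
                    [6.5, 0, 1, 1]], dtype=float)
y_train = np.array([0, 1, 2, 0, 1, 2])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

new_session = np.array([[5.0, 3, 1, 5]])   # features of an ongoing session
affect = int(clf.predict(new_session)[0])  # detected affect label
print("detected affect:", affect)          # the game could adapt the next scenario accordingly
```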

MaTHiSiS – Managing Affective-learning THrough Intelligent atoms and Smart InteractionS

MaTHiSiS (Managing Affective-learning THrough Intelligent atoms and Smart InteractionS) is an H2020 project under the topic ICT-20-2015 “Technologies for better human learning and teaching”, with a total cost of €7,618,584.

The MaTHiSiS learning vision is to provide a product-system for vocational training and mainstream education for individuals both with and without an intellectual disability. This product-system consists of an integrated platform, along with a set of re-usable learning components (educational material, digital educational artefacts, etc.), which respond to the needs of a future educational framework, as drawn by the call, and provide capabilities for: i) adaptive learning, ii) automatic feedback, iii) automatic assessment of the learner's progress and behavioural state, iv) affective learning and v) game-based learning.
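As a minimal sketch of how such affect-driven adaptation can work (not the MaTHiSiS implementation; the affect measures, thresholds and update rule are illustrative assumptions), a simple loop can map the sensed learner state to the difficulty of the next learning atom:

```python
# Minimal sketch of an affect-driven adaptation rule; values and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class LearnerState:
    engagement: float   # 0 (disengaged) .. 1 (fully engaged), e.g. from affect sensing
    frustration: float  # 0 .. 1, e.g. from facial-expression or interaction analysis

def adapt_difficulty(current: float, state: LearnerState) -> float:
    """Return the difficulty of the next learning atom, clamped to [0, 1]."""
    if state.frustration > 0.7:      # learner struggling -> ease off
        current -= 0.1
    elif state.engagement < 0.3:     # learner bored -> raise the challenge
        current += 0.1
    return min(1.0, max(0.0, current))

# Example: a frustrated learner receives an easier next learning atom.
print(adapt_difficulty(0.5, LearnerState(engagement=0.6, frustration=0.8)))  # -> 0.4
```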

The MaTHiSiS consortium is coordinated by Atos Spain and consists of 18 beneficiary organizations from 9 different Member States: Spain, France, Greece, the UK, the Netherlands, Belgium, Italy, Lithuania and Germany.

BioMot – Smart wearable robots with bioinspired sensory-motor skills

The main objective of the project is to improve existing wearable robotic exoskeletons exploiting dynamic sensory-motor interactions and developing cognitive capabilities that can lead to symbiotic gait behavior in the interaction of a human with a wearable robot.

BioMot proposes a cognitive architecture for wearable robots (WRs) that exploits neuronal control and learning mechanisms, whose main goal is to enable positive co-adaptation and seamless interaction with humans.

The BioMot consortium brings together eight (8) partners from five different countries (Spain, Belgium, Italy, Iceland and Japan) on the basis of the multidisciplinary expertise and trans-nationality required to meet its ambitious objectives and ensure the proper exploitation of results, both scientifically and clinically/commercially.

Brain2Motion – Exoskeletal-neuroprosthesis hybrid robotic system for the upper limb controlled by a multimodal brain-neural interface

Exoskeletal robots (ERs) are person-oriented robots that supplement the function of a limb or replace it completely. A possible alternative to ERs is the Motor Neuro-Prosthesis (MNP), based on Functional Electrical Stimulation (FES). Both ERs and MNPs are technologies that seek to restore or substitute motor function. MNPs restore function by artificially controlling human muscles or muscle nerves with FES, whereas ERs use volitional commands to apply controlled forces that drive paralyzed or weak limbs.

The main goal of the BRAIN2MOTION project is to develop a new hybrid ER-MNP for the upper limb, interfaced to the user by means of non-invasive multimodal brain-neural computer interfaces (BNCIs). The robotic hybrid system will combine a light and kinematically compatible ER and a textile-based surface MNP. In this combined ER-MNP, hardware and control strategies will be developed to combine the action of the ER and MNP while preserving the latent motor capabilities of the user. The multimodal BNCI will combine a spontaneous, non-invasive EEG-based Brain-Computer Interface (BCI) and an electrooculography (EOG) interface. The BCI will differentiate more than three mental tasks, which will be achieved by incorporating new adaptive classifiers into the BCI. Learning strategies will be developed to improve the performance and versatility of the BCI. Control strategies combining EEG and EOG signals will be developed to control the ER-MNP.
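As a minimal sketch of the kind of mental-task classification such a BCI relies on (not the BRAIN2MOTION code; the sampling rate, frequency bands, classifier and synthetic data are illustrative assumptions), band-power features extracted from EEG epochs can be fed to an SVM:

```python
# Minimal sketch: SVM classification of mental tasks from EEG band-power features.
# All parameters and the synthetic data are assumptions for illustration.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

FS = 256                                        # assumed sampling rate (Hz)
BANDS = {"alpha": (8, 12), "beta": (13, 30)}    # assumed frequency bands

def band_power_features(epochs):
    """epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
    feats = []
    for trial in epochs:
        freqs, psd = welch(trial, fs=FS, nperseg=FS)   # PSD per channel
        row = [psd[:, (freqs >= lo) & (freqs <= hi)].mean(axis=1)
               for lo, hi in BANDS.values()]
        feats.append(np.concatenate(row))
    return np.asarray(feats)

# Synthetic placeholder data: 120 trials, 16 channels, 2 s epochs, 4 mental tasks.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((120, 16, 2 * FS))
labels = rng.integers(0, 4, size=120)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, band_power_features(epochs), labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```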

The hybrid ER-MNP controlled by the BNCI will be used to perform reaching and grasping operations. The system will be validated with patients suffering from neurological conditions leading to severe motor disorders, in particular cerebrovascular accident (CVA).

Publications

Books:

  • E. Hortal, “Brain-Machine Interfaces for Assistance and Rehabilitation of People with Reduced Mobility”, Springer Theses: Recognizing Outstanding Ph.D. Research, 2018. ISBN: 9783319957050. dx.doi.org/10.1007/978-3-319-95705-0.

Book chapters:

  • D. Delisle-Rodríguez, T. Freire, Á. Costa, E. Hortal, J.C. Moreno, J. C. Alcázar, G. Herrera, S. Casco, A. del-Ama, “Interfaces hombre-máquina”. Book: Exoesqueletos Robóticos para Rehabilitación y Asistencia de Pacientes con Daño Neurológico: Experiencias y Posibilidades en Iberoamérica, ISBN-13: 978-84-15413-29-5.
  • E. Hortal, J.C. Moreno, E. Rocón, “Experiencias clínicas con exoesqueletos”. Book: Exoesqueletos Robóticos para Rehabilitación y Asistencia de Pacientes con Daño Neurológico: Experiencias y Posibilidades en Iberoamérica, ISBN-13: 978-84-15413-29-5.
  • E. Hortal, E. Iáñez, A. Úbeda, J.M. Azorín, “Brain-Machine Interfaces for Assistive Robotics”. Book: Intelligent Assistive Robots – Recent Advances in Assistive Robotics for Everyday Activities, Springer Tracts in Advanced Robotics, Vol. 106, 77-102 (26 pages). Editors: Mohammed, S., Moreno, J.C., Kong, K., Amirat, Y., Springer International Publishing Switzerland 2015, 14 February 2015, XII, 480 p. ISBN 978-3-319-12921-1. dx.doi.org/10.1007/978-3-319-12922-8_3
  • E. Hortal, D. Planelles, E. Iáñez, A. Costa, A. Úbeda, J.M. Azorín, “Detection of Gait Initiation Through a ERD-Based Brain-Computer Interface”. Book: Advances in Neurotechnology, Electronics and Informatics, 141-150 (10 pages). Springer International Publishing Switzerland, 11 December 2015. ISBN 978-3-319-26240-6. dx.doi.org/10.1007/978-3-319-26242-0_10

Journal papers: 

  • E. Hortal, R.B. Alarcia (2021). “GANtron: Emotional Speech Synthesis with Generative Adversarial Networks”. arXiv preprint arXiv:2110.03390.
  • P.J. Standen, D. Brown, M. Taheri, M.J. Galvez Trigo, H. Boulton, A. Burton, M. Hallewell, J. Lathe, N. Shopland, M. Gonzalez, G. Kwiatkowska, E. Milli, S. Cobello, A. Mazzucato, M. Traversi, E. Hortal, “An evaluation of an adaptive learning system based on multimodal affect recognition for learners with intellectual disabilities”, British Journal of Educational Technology, 2020. dx.doi.org/10.1111/bjet.13010. IF: 2.951 (Q1).
  • C. Athanasiadis, E. Hortal, S. Asteriadis, “Audio–visual domain adaptation using conditional semi-supervised Generative Adversarial Networks”, Neurocomputing, 2020. doi.org/10.1016/j.neucom.2019.09.106. IF: 4.072 (Q1)
  • C. Athanasiadis, M. Amestoy, E. Hortal, S. Asteriadis, “e-3 learning: a Dataset for Affect-driven Adaptation of Computer-Based Learning”, IEEE MultiMedia, 27(1), 2019. IF: 3.556 (Q1). Available here.
  • N. Vretos, P. Daras, S. Asteriadis, E. Hortal, E. Ghaleb, et al. “Exploiting sensing devices availability in AR/VR deployments to foster engagement”, Virtual Reality, 2018. dx.doi.org/10.1007/s10055-018-0357-0. IF: 1.375 (Q2)
  • E. Hortal, A. Úbeda, E. Iáñez, J. M. Azorín, E. Fernández, “EEG-Based Detection of Starting and Stopping During Gait Cycle”, International Journal of Neural Systems, 26(7), 2016. dx.doi.org/10.1142/S0129065716500295. IF: 6.085 (Q1)
  • Á. Costa, E. Iáñez, A. Úbeda, E. Hortal, A.J. Del-Ama, A. Gil-Agudo, J.M. Azorín, “Decoding the attentional demands of gait through EEG gamma band features”, PloS one, 11(4), e0154136, 2016. dx.doi.org/10.1371/journal.pone.0154136. IF: 3.234 (Q1)
  • R. Salazar-Varas, Á. Costa, E. Iáñez, A. Úbeda, E. Hortal, J. M. Azorín, “Analyzing EEG signals to detect unexpected obstacles during walking”, Journal of NeuroEngineering and Rehabilitation, 12(101), 2015. dx.doi.org/10.1186/s12984-015-0095-4. IF: 2.740 (Q1)
  • E. Hortal, D. Planelles, F. Resquin, J. M. Climent, J. M. Azorín, J. L. Pons, “Using a brain-machine interface to control a hybrid upper limb exoskeleton during rehabilitation of patients with neurological conditions”, Journal of NeuroEngineering and Rehabilitation, 12(92), 2015. dx.doi.org/10.1186/s12984-015-0082-9. IF: 2.740 (Q1)
  • E. Hortal, E. Iáñez, A. Úbeda, C. Perez-Vidal, J.M. Azorín, “Combining a Brain–Machine Interface and an Electrooculography Interface to perform pick and place tasks with a robotic arm”, Robotics and Autonomous Systems, 72, 181-188, October 2015. dx.doi.org/10.1016/j.robot.2015.05.010. IF: 1.256 (Q2)
  • E. Hortal, D. Planelles, A. Costa, E. Iáñez, A. Úbeda, J.M. Azorín, E. Fernández, “SVM-based Brain-Machine Interface for controlling a robot arm through four mental tasks”, Neurocomputing, 151 (1), 116-121, March 2015. dx.doi.org/10.1016/j.neucom.2014.09.078. IF: 2.083 (Q2)
  • A. Úbeda, E. Hortal, E. Iáñez, C. Perez-Vidal, J.M. Azorín, “Assessing Movement Factors in Upper Limb Kinematics Decoding from EEG Signals”, PLoS ONE 10(5), 2015. dx.doi.org/10.1371/journal.pone.0128456. IF: 3.234 (Q1)
  • Á. Costa, E. Hortal, E. Iáñez, J. M. Azorín, “A Supplementary System for a Brain-Machine Interface Based on Jaw Artifacts for the Bidimensional Control of a Robotic Arm”, PLoS ONE, 9(11): e112352, November 2014. dx.doi.org/10.1371/journal.pone.0112352. IF: 3.234 (Q1)
  • D. Planelles, E. Hortal, A. Costa, A. Úbeda, E. Iáñez, J.M. Azorín, “Evaluating Classifiers to Detect Arm Movement Intention from EEG Signals”, Sensors, 14: 18172-18186, September 2014. dx.doi.org/10.3390/s141018172. IF: 2.245 (Q1)
  • E. Hortal, A. Úbeda, E. Iáñez and J.M. Azorín, “Control of a 2 DoF Robot Using a Brain-Machine Interface”, Computer Methods and Programs in Biomedicine, New methods of human-robot interaction in medical practice, 116(2): 169-176, September 2014. dx.doi.org/10.1016/j.cmpb.2014.02.018. IF: 1.897 (Q1)

National conferences:

  • E. Hortal, M. Rodríguez-Ugarte, J. Ibáñez, J. L. Pons, J. M. Azorín, “Diseño preliminar de una plataforma experimental basada en neuroestimulación para el análisis de la interacción corticomuscular”. Actas del XXXIII Congreso Anual de la Sociedad Española de Ingeniería Biomédica (CASEIB). Madrid, Spain, pages: 451-454. ISBN: 978-84-608-3354-3. 4-6 November 2015
  • A. Belda, E. Hortal, J. M. Azorín. “Control de un robot humanoide mediante el uso de una interfaz cerebro-computador”.  Actas de las XXXVI Jornadas de Automática, Bilbao, Spain, pages: 773-778. 2 – 4 September, 2015
  • E. Iáñez, A. Úbeda, E. Hortal, Á. Costa, J.M. Azorín. “Decodificación del ángulo de rodilla a partir de señales EEG”. Cognitive Area Networks, Proceedings of 7º Simposio CEA Bioingeniería, Interfaces Cerebro-Computador (BCI) y Tecnologías Asistenciales. Málaga, Spain. Vol. 2, pages: 45-50. ISSN: 2341-4243, 25-26 June, 2015
  • I. Ríos, E. Hortal, J. A. Flores, J. Gimeno, J. M. Azorín. “Estudio de la aplicación de estimulación eléctrica funcional para la mejora del funcionamiento de interfaces cerebro-computadora”. Actas de las XXXV Jornadas de Automática, Valencia, Spain, pages: 166-171, ISBN-13: 978-84-697-0589-6, 3-5 September 2014
  • A. León, E. Hortal, A. Rodríguez, J. M. Climent, J. M. Cano, José M. Azorín. “Implementación de una librería en Simulink para el desarrollo de interfaces cerebro-computador”. Actas de las XXXV Jornadas de Automática, Valencia, Spain, pages: 236-242, ISBN-13: 978-84-697-0589-6, 3-5 September 2014
  • E. Hortal, D. Planelles, A. Úbeda, A.D. Koutsou, F. Resquín, J.M. Azorín and J.L. Pons. “Arquitectura de una Interfaz Cerebro-Máquina para el control de un exoesqueleto robot de miembro superior”. Cognitive Area Networks, proceedings of 6º Simposio CEA Bioingeniería, Granada, Spain. Vol. 1, pages: 13-18. ISSN: 2341-4243. 12-13 June, 2014
  • A. Costa, E. Iáñez, E. Hortal, J. M. Azorín, A. Rodríguez, D. Tornero, J.A. Berná, J.M. Cano. “Movimiento bidimensional de un cursor mediante el uso de artefactos en señales electroencefalográficas”. Actas de las XXXIV Jornadas de Automática. Terrassa, Spain, pages: 108-114. ISBN: 978-84-616-5063-7, 4-6 September 2013
  • E. Hortal, A. Úbeda, E. Iáñez, A. Rodríguez, J.M. Azorín. “Detección de la intención de movimiento del brazo mediante señales EEG”. 4º Simposio CEA Bioingeniería 2012 – Redes REDINBIO y RETADIM – BCI (Brain Computer Interface) y Tecnologías de la Rehabilitación – Libro de Actas. Valladolid, Spain, pages: 35-41. ISBN: 978-84-695-3541-7, 29 May 2012

International conferences:

  • J. García-Fernández, E. Hortal, and S. Mehrkanoon, “Towards biologically plausible learning in neural networks”. In 2021 IEEE Symposium Series on Computational Intelligence (SSCI) (pp. 01-08). IEEE. 10.1109/SSCI50451.2021.9659539 Available here.
  • C. Athanasiadis, E. Hortal, and S. Asteriadis, “Temporal conditional Wasserstein GANs for audio-visual affect-related ties”. In 2021 9th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW). p. 1-8 8 p. Available here.
  • S. Bhat and E. Hortal, “GAN-Based Data Augmentation For Improving The Classification Of EEG Signals”. In The 14th PErvasive Technologies Related to Assistive Environments Conference (PETRA 2021). Association for Computing Machinery, New York, NY, USA, 453–458. 2021. DOI: doi.org/10.1145/3453892.3461338. Available here.
  • C. Athanasiadis, E. Hortal and S. Asteriadis, “Audio-Based Emotion Recognition Enhancement Through Progressive Gans”. In 2020 IEEE International Conference on Image Processing (ICIP) (pp. 236-240). IEEE, October 2020. Available here.
  • C. Athanasiadis, E. Hortal and S. Asteriadis, “Bridging face and sound modalities through Domain Adaptation Metric Learning”. European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2019), Bruges, Belgium, 24-26 April 2019. Available here.
  • E. Ghaleb, M. Popa, E. Hortal, S. Asteriadis and G. Weiss, “Towards Affect Recognition through Interactions with Learning Materials”. In 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, USA, 17-20 December 2018. Available here.
  • D. Tsatsou, A. Pomazanskyi, E. Hortal, et al., “Adaptive Learning Based on Affect Sensing”. International Conference on Artificial Intelligence in Education (AIED 2018), Lecture Notes in Computer Science, vol 10948, London, United Kingdom, 2018. Available here.
  • E. Ghaleb, M. Popa, E. Hortal, S. Asteriadis, “Multimodal Fusion Based on Information Gain for Emotion Recognition in the Wild”. Intelligent Systems Conference (IntelliSys 2017), London, United Kingdom, 7-8 September 2017. Available here.
  • J. Schwan, E. Ghaleb, E. Hortal, S. Asteriadis, “High-performance and Lightweight Real-time Deep Face Emotion Recognition”. SMAP2017 – 12th International Workshop on Semantic and Social Media, Bratislava, Slovakia, 9-10 July, 2017. Available here.
  • C. Athanasiadis, C.Z. Lens, D. Koutsoukos, E. Hortal, S. Asteriadis, “Personalized, affect and performance-driven Computer-based Learning“. 9th International Conference on Computer Supported Education (CSEDU 2017), Porto, Portugal, 21-23 April 2017
  • M. Rodríguez-Ugarte, E. Hortal, Á. Costa, E. Iáñez, A. Úbeda, J.M. Azorín, “Detection of intention of pedaling start cycle through EEG signals”. 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2016), 1496-1499. ISBN: 978-1-4577-0220-4. 2016. 10.1109/EMBC.2016.7590993
  • Á. Costa, E. Iáñez, E. Hortal, A. Úbeda, M. Rodríguez-Ugarte, J.M. Azorín, “Evaluation of Motion Artifacts on EEG Signals during Exoskeleton Gait”. Proceedings of the Sixth International Brain-Computer Interface Meeting: BCI Past, Present, and Future. Asilomar Conference Center, Pacific Grove, California, USA, 99-99. ISBN: 978-3-85125-467-9. June 2016. dx.doi.org/10.3217/978-3-85125-467-9
  • E. Iáñez, Á. Costa, E. Hortal, A. Úbeda, M. Rodríguez-Ugarte, J.M. Azorín, “New approach based on frequency features of EEG signals when obstacles suddenly appear during walking”. Proceedings of the Sixth International Brain-Computer Interface Meeting: BCI Past, Present, and Future. Asilomar Conference Center, Pacific Grove, California, USA, 188-188. ISBN: 978-3-85125-467-9. June 2016. dx.doi.org/10.3217/978-3-85125-467-9
  • E. Iáñez, Á. Costa, A. Úbeda, E. Hortal, M. Rodríguez-Ugarte and J.M. Azorín, “Evaluating Cognitive Mechanisms During Walking from EEG Signals”. Converging Clinical and Engineering Research on Neurorehabilitation II. Proceedings of the 3rd International Conference on NeuroRehabilitation (ICNR2016), October 18–21, 2016, Segovia, Spain, 1463-1467. ISBN 978-3-319-46668-2, ISBN 978-3-319-46669-9 (eBook). dx.doi.org/10.1007/978-3-319-46669-9
  • Á. Costa, R. Salazar-Varas, E. Iáñez, A. Úbeda, E. Hortal, J.M. Azorín. “Studying cognitive attention mechanisms during walking from EEG signals”. IEEE International Conference on Systems, Man, and Cybernetics (SMC 2015), Special Session on “Robotic Exoskeletons with Bioinspired Skills”, Hong Kong, 9-12 Oct 2015. ISBN: 978-1-4799-8697-2 10.1109/SMC.2015.162
  • E. Hortal, E. Márquez-Sánchez, A. Costa, E. Piñuela-Martín, R. Salazar, A.J. del-Ama, A. Gil-Agudo, J.M. Azorín. “Starting and finishing gait detection using a BMI for spinal cord injury rehabilitation”. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2015), Innovative Session on “Wearable Robotics for Motion Assistance and Rehabilitation”, Hamburg, Germany, pages: 6184-6189. September 28 – October 03, 2015
  • E. Iáñez, A. Úbeda, Á. Costa, E. Hortal, J.M. Azorín. “Decoding of knee angles through EEG using active electrodes”. 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2015), Milan, Italy, 1 page. 25 – 29 August, 2015
  • J. Castillo-Garcia, E. Caicedo, T. Bastos, E. Hortal, E. Iáñez, J.M. Azorín. “Active Learning for Adaptive Brain Machine Interface Based on Software Agent”. 23rd Mediterranean Conference on Control and Automation (MED 2015), Torremolinos, Spain, pages: 56-60. ISBN: 978-1-4799-9935-4. 16-19 June 2015
  • E. Hortal, A. Úbeda, E. Iáñez, E. Fernández, J.M. Azorín. “Using EEG signals to detect the intention of walking initiation and stopping”. Artificial Computation in Biology and Medicine, 6th International Work-Conference on the Interplay between Natural and Artificial Computation (IWINAC 2015), Proceedings, Part I, Lecture Notes in Computer Science (LNCS) 9107, Elche, Spain, Vol. 9107, pages: 278-287. ISBN: 978-3-319-18913-0, ISSN: 0302-9743, 1-5 June 2015. dx.doi.org/10.1007/978-3-319-18914-7_29
  • J.M. Azorín, A. Úbeda, D. Planelles, A. Costa, E. Hortal, E. Iáñez. “Analysis of electrode configurations for the decoding of knee angles from EEG signals during gait”. Neuroscience 2014, Washington DC, USA, 15-19 November 2014
  • D. Planelles, E. Iáñez, E. Hortal, A. Úbeda, A. Costa and J.M. Azorín. “Preliminary study to detect gait initiation intention through a BCI system”. In Proceedings of the 2nd International Congress on Neurotechnology, Electronics and Informatics, NEUROTECHNIX, Rome, Italy, pages: 61-66. 25-26 October 2014. DOI: 10.5220/0005167800610066
  • E. Hortal, E. Iáñez, A. Úbeda, D. Planelles, A. Costa and J.M. Azorín. “Selection of the best mental tasks for a SVM-based BCI system”. IEEE International Conference on Systems, Man, and Cybernetics, San Diego, USA, pages: 1502-1507, ISBN: 978-1-4799-3839-1, 5-8 October, 2014
  • A. Úbeda, D. Planelles, A. Costa, E. Hortal, E. Iáñez and J.M. Azorín. “Decoding knee angles from EEG signals for different walking speeds”. IEEE International Conference on Systems, Man, and Cybernetics, San Diego, USA, pages: 1494-1497, ISBN: 978-1-4799-3839-1, 5-8 October, 2014
  • A. Costa, E. Iáñez, A. Úbeda, D. Planelles, E. Hortal, J.M. Azorín. “Frequency and Number of Neighbors Study for Attention Level Classification Using EEG Signals”. International Workshop on Wearable Robotics (WeRob 2014), Baiona, Spain, 14-19 September 2014
  • A. Costa, E. Iáñez, A. Úbeda, D. Planelles, E. Hortal, J.M. Azorín. “Experimental Setup and First Results of a BCI System for Attention Levels Classification During Gait”. NeuroRob-2014, Padova, Italy, 18 July 2014
  • A. Costa, E. Hortal, A. Úbeda, E. Iáñez, J.M. Azorín. “Reducing the False Positives Rate in a BCI System to Detect Error-Related EEG Potentials”. Replace, Repair, Restore, Relieve – Bridging Clinical and Engineering Solutions in Neurorehabilitation, Proceedings of  the 2nd International Conference on NeuroRehabilitation (ICNR 2014), Aalborg, Denmark. Vol. 7, pages: 321-327. ISSN: 2195-3562. ISBN: 978-3-319-08071-0. 24-26 June 2014
  • D. Planelles, E. Hortal, E. Iáñez, A. Costa and J.M. Azorín. “Processing EEG Signals to Detect Intention of Upper Limb Movement”. Replace, Repair, Restore, Relieve – Bridging Clinical and Engineering Solutions in Neurorehabilitation, Proceedings of  the 2nd International Conference on NeuroRehabilitation, Aalborg, Denmark. Vol. 7, pages: 655-664. ISSN: 2195-3562. ISBN: 978-3-319-08071-0. 24-26 June 2014
  • A. Úbeda, D. Planelles, E. Hortal, F. Resquín, A.D. Koutsou, J.M. Azorín and J.L. Pons. “A Brain-Machine Interface Architecture to Control an Upper Limb Rehabilitation Exoskeleton”. Replace, Repair, Restore, Relieve – Bridging Clinical and Engineering Solutions in Neurorehabilitation, Proceedings of  the 2nd International Conference on NeuroRehabilitation, Aalborg, Denmark. Vol. 7, pages: 795-804. ISSN: 2195-3562. ISBN: 978-3-319-08071-0. 24-26 June 2014
  • D. Planelles, E. Hortal, A. Costa, E. Iáñez and J.M. Azorín, “First steps in the development of an EEG-based system to detect intention of gait initiation”, 8th Annual IEEE International Systems Conference, Ottawa, Canada. Pages: 167-171. ISBN: 978-1-4799-2087-7. 31 March – 3 April 2014
  • E. Hortal, D. Planelles, A. Úbeda,  A. Costa and J.M. Azorín, “Brain-Machine Interface System to Differentiate between Five Mental Tasks”, 8th Annual IEEE International Systems Conference, Ottawa, Canada. Pages: 172-175. ISBN: 978-1-4799-2087-7. 31 March – 3 April 2014
  • E. Hortal, E. Iáñez, A. Úbeda, D. Planelles, J.M. Azorín. “Comparativa de clasificadores para diferenciación de 4 estados mentales utilizando señales EEG”. Libro de Actas VII Congreso Iberoamericano de Tecnologías de Apoyo a la Discapacidad, IBERDISCAP 2013. Santo Domingo, Dominican Republic, pages: 42-47. ISBN: 978-9945-00-959-9, 28-29 November 2013
  • D. Jiménez, A. Úbeda, E. Iáñez, E. Hortal, D. Planelles, J.M. Azorín. “Detección de la voluntad de movimiento del brazo basada en la desincronización relacionada a eventos”. Libro de Actas VII Congreso Iberoamericano de Tecnologías de Apoyo a la Discapacidad, IBERDISCAP 2013. Santo Domingo, Dominican Republic, pages: 251-255. ISBN: 978-9945-00-959-9, 28-29 November 2013
  • A. Úbeda, E. Hortal, E. Iáñez, D. Planelles, J. M. Azorín. “Passive robot assistance in arm movement decoding from EEG signals”. 6th Annual International IEEE EMBS Conference on Neural Engineering, San Diego, California. Pages: 895-898. ISBN: 978-1-4673-1969-0. 6-8 November 2013
  • E. Hortal, A. Úbeda, E. Iáñez, D. Planelles, J. M. Azorín. “Online classification of two mental tasks using a SVM-based BCI system”. 6th Annual International IEEE EMBS Conference on Neural Engineering, San Diego, California. Pages: 1307-1310. ISBN: 978-1-4673-1969-0. 6-8 November 2013
  • E. Iáñez, A. Úbeda, E. Hortal, J. M. Azorín, E. Fernández. “Empirical Analysis of the Integration of a BCI and an EOG Interface to Control a Robot Arm”. Natural and Artificial Models in Computation and Biology, 5th International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2013, Mallorca, Spain, Proceedings, Part I, Lecture Notes in Computer Science (LNCS) 7930. Pages: 151-160. ISSN: 0302-9743. ISBN: 978-3-642-38636-7. 10-14 June 2013. dx.doi.org/10.1007/978-3-642-38637-4
  • E. Hortal, E. Iáñez, A. Úbeda, J. M. Azorín, E. Fernández. “Training Study Approaches for a SVM-Based BCI: Adaptation to the Model vs Adaptation to the User”. Natural and Artificial Models in Computation and Biology, 5th International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2013, Mallorca, Spain, Proceedings, Part I, Lecture Notes in Computer Science (LNCS) 7930. Pages: 131-140. ISSN: 0302-9743. ISBN: 978-3-642-38636-7. 10-14 June 2013. dx.doi.org/10.1007/978-3-642-38637-4
  • E. Hortal, A. Úbeda, E. Iáñez, D. Planelles, J. M. Azorín. “Selection of the best classifier for differentiating mental tasks in a brain-machine interface”. 18th IFESS Annual Conference 2013 – Bridging Mind and Body (Donostia-San Sebastián, Spain). Pages: 231-234. ISBN: 978-86-7466-462-9. 5-8 June 2013
  • E. Iáñez, A. Úbeda, E. Hortal, J.M. Azorín. “Evaluation methodology for brain-controlled robots”. IEEE International Conference on Robotics and Automation (ICRA 2013), Workshops and Tutorials Proceedings. Karlsruhe, Germany, 6-10 May 2013
  • E. Iáñez, A. Úbeda, E. Hortal, J.M. Azorín. “Mental Tasks Selection Method for a SVM-Based BCI System”. Proceedings of the 7th Annual IEEE International Systems Conference (SYSCON 2013), Orlando, United States. Pages: 767-771. ISBN: 978-1-4673-3106-7. 15-18 April 2013
  • A. Úbeda, E. Iáñez, E. Hortal, J.M. Azorín. “Linear Decoding of 2D Hand Movements for Target Selection Tasks Using a Non-Invasive BCI System”. Proceedings of the 7th Annual IEEE International Systems Conference (SYSCON 2013), Orlando, United States. Pages: 778-782. ISBN: 978-1-4673-3106-7. 15-18 April 2013
  • E. Hortal, E. Iáñez, A. Úbeda, D. Tornero and J. M. Azorín. “Decoding upper limb movement velocity for stroke rehabilitation”. Proceedings of the International Conference on Neurorehabilitation (ICNR 2012), Converging Clinical and Engineering Research on Neurorehabilitation, Toledo, Spain. Pages: 491-494. ISBN: 978-3-642-34545-6. 14-16 November 2012


Contact me

E-mail: enrique.hortal@maastrichtuniversity.nl

Skype: enrique.hortal

Visitor address:

Paul-Henri Spaaklaan 1, 6229 EN

Maastricht, the Netherlands (room C4.020)

Tel.: +31 (0)43 38 83902