+ Oct. 2018 – | The University of Manchester. Cognitive Robotics Lab. |
+ Jan. 2019 | Three-month research stay focused on the use of Deep Learning for the study
of abstract concepts with the iCub robot (Manchester, UK). |
- Sept. 2011 – | Istituto Italiano di Tecnologia. Department of Robotics, Brain and Cognitive Sciences. |
- Dec. 2011 | Three-month research stay focused on the use of Support Vector Machines and
+ | Sept. 2011 – | Istituto Italiano di Tecnologia. Department of Robotics, Brain and Cognitive Sciences. |
+ Dec. 2011 | Three-month research stay focused on the use of Support Vector Machines and
Gaussian mixtures for force control of the iCub robot (Genoa, Italy). |
-
+
Workshop Organizer (2)
-
+
+
+
- Juan G. Victores, Lorenzo Natale, Eiichi Yoshida. Towards Humanoid Robots OS. HUMANOIDS.
- Cancun, Mexico. Nov 15. 2016. https://roboticslab-uc3m.github.io/workshop-humanoids2016/
+ Cancun, Mexico. Nov 15. 2016. https://roboticslab-uc3m.github.io/workshop-humanoids2016/
- Angelos Amditis, Konstantinos Loupos, Juan G. Victores. Autonomous Robotic Systems for Inspection
and Structural Assessment of Civil Underground Infrastructures. European Robotics Forum (ERF).
- Ljubljana, Slovenia. Mar 22. 2016. https://www.eu-robotics.net/robotics\_forum/upload/digest\_1-96\_without\_emails\_250ppi1.pdf
-
-
-
+ Ljubljana, Slovenia. Mar 22. 2016. https://www.eu-robotics.net/robotics\_forum/upload/digest\_1-96\_without\_emails\_250ppi1.pdf
+
Talks (2)
-
+
- Juan G. Victores. XGNITIVE: Advances towards advanced generalization of actions and imagination
systems in robotics. Technology Festival (Techfest). Universidad Rey Juan Carlos (URJC).
- 2017. https://www.eventbrite.es/e/registro-technology-festival-urjc-2017-28838850779?aff=es2#
+ 2017. https://www.eventbrite.es/e/registro-technology-festival-urjc-2017-28838850779?aff=es2#
- Angelos Amditis, Juan G. Victores, Fedi Francesco. Welcome and Introduction. Autonomous Robotic
Systems for Inspection and Structural Assessment of Civil Underground Infrastructures. European
- Robotics Forum (ERF). Ljubljana, Slovenia. Mar 22. 2016. https://www.eu-robotics.net/robotics\_forum/upload/digest\_1-96\_without\_emails\_250ppi1.pdf
+ Robotics Forum (ERF). Ljubljana, Slovenia. Mar 22. 2016. https://www.eu-robotics.net/robotics\_forum/upload/digest\_1-96\_without\_emails\_250ppi1.pdf
\ No newline at end of file
diff --git a/cv/JuanGVictoresCV.pdf b/cv/JuanGVictoresCV.pdf
index a8c6bc7..2ceda96 100644
Binary files a/cv/JuanGVictoresCV.pdf and b/cv/JuanGVictoresCV.pdf differ
diff --git a/cv/JuanGVictoresCV.tex b/cv/JuanGVictoresCV.tex
index a293205..de80db1 100644
--- a/cv/JuanGVictoresCV.tex
+++ b/cv/JuanGVictoresCV.tex
@@ -96,9 +96,10 @@ \section*{Book Chapters (8)}
\item \bibentry{balaguer2010robotic} [robot] [construction]
\end{enumerate}
-\section*{Conference Proceedings (50)}
+\section*{Conference Proceedings (51)}
\begin{enumerate}
\item \bibentry{fernandezfernandez2022neuralB} [robot] [xgnitive: cgda]
+ \item \bibentry{gago2020under-actuation} [robot] [sign-language]
\item \bibentry{sierragarcia2019neural} [robot] [control]
\item \bibentry{gago2019sequence} [robot] [sign-language]
\item \bibentry{estevez2019towards} [robot] [textiles: hanging]
diff --git a/cv/victores.bib b/cv/victores.bib
index 2d84129..f64becc 100644
--- a/cv/victores.bib
+++ b/cv/victores.bib
@@ -1,3 +1,22 @@
+@inproceedings{gago2020under-actuation,
+ abstract = {This paper presents a study on under-actuation modelling applied to robotic hands aimed at sign language representation. Prior studies using a simulated TEO humanoid robot for representing sign language have shown positive comprehension and satisfaction responses among the deaf and hearing impaired community. The under-actuated mechanics of the robotic fingers were not contemplated in the simulated model, thus the correspondence problem arises as the previous joint space positions cannot be directly sent to the physical system. In addition to the 3:1 and 2:1 ratio of the under-actuation of the finger mechanisms, tendons and springs involve stiffness and elasticity that are difficult or unfeasible to model, and justify the need for a data-driven approach. Three motor command generators using three different neural network models are analysed and evaluated. Two of the generators are trained in a supervised fashion, and the third involves variational self-supervision and a transformation upon the latent space. The simulated joint space positions are translated into motor commands for the physical embodied robot to represent a sign language dactylology, which is in turn evaluated by deaf and hearing impaired end-users.},
+ author = {Gago, Jennifer J. and Łukawski, Bartek and Victores, Juan G. and Balaguer, Carlos},
+ address = {Cham},
+ editor = {Analide, Cesar and Novais, Paulo and Camacho, David and Yin, Hujun},
+ isbn = {978-3-030-62365-4},
+ booktitle = {Intelligent Data Engineering and Automated Learning – IDEAL 2020},
+ pages = {239--251},
+ publisher = {Springer International Publishing},
+ title = {Under-Actuation Modelling in Robotic Hands via Neural Networks for Sign Language Representation with End-User Validation},
+ url = {https://doi.org/10.1007/978-3-030-62365-4_23},
+ year = {2020},
+}
@article{delatorre2024spasticsim,
abstract = {In neurorehabilitation, assessment of functional problems is essential to define optimal rehabilitation treatments. Usually, this assessment process requires distinguishing between impaired and non-impaired behavior of limbs. One of the common muscle motor disorders affecting limbs is spasticity, which is complicated to quantify objectively due to the complex nature of motor control. Thus, the lack of heterogeneous samples of patients constituting an acceptable amount of data is an obstacle which is relevant to understanding the behavior of spasticity and, consequently, quantifying it. In this article, we use the 3D creation suite Blender combined with the MBLab add-on to generate synthetic samples of human body models, aiming to be as sufficiently representative as possible to real human samples. Exporting these samples to OpenSim and performing four specific upper limb movements, we analyze the muscle behavior by simulating the six degrees of spasticity contemplated by the Modified Ashworth Scale (MAS). The complete dataset of patients and movements is open-source and available for future research. This approach advocates the potential to generate synthetic data for testing and validating musculoskeletal models.},
author = {Rubén de-la-Torre and Edwin Daniel Oña and Juan G. Victores and Alberto Jardón},
|