<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
  <channel>
    <title>DSpace Collection:</title>
    <link>http://hdl.handle.net/2268.2/19383</link>
    <description />
    <pubDate>Tue, 14 Apr 2026 22:27:46 GMT</pubDate>
    <dc:date>2026-04-14T22:27:46Z</dc:date>
    <item>
      <title>Evaluation of Spiking Neural Network (SNN) Models for Detection &amp; Classification Using FMCW Radar Data</title>
      <link>http://hdl.handle.net/2268.2/24951</link>
      <description>Title: Evaluation of Spiking Neural Network (SNN) Models for Detection &amp; Classification Using FMCW Radar Data
Abstract: Processing radar data presents significant challenges due to its high dimensionality, noise, and temporal complexity,&#xD;
which make extracting robust and efficient features difficult. Spiking neural networks (SNNs) offer a biologically inspired alternative to conventional artificial neural networks (ANNs), with the promise of energy efficiency and temporal&#xD;
information processing. This work aims to evaluate the feasibility of SNN models for classification and detection tasks&#xD;
on frequency-modulated continuous-wave (FMCW) radar data, particularly in the industrial context of BEA.&#xD;
To address this, a complete dataset of FMCW radar signals was created and automatically annotated using a dedicated tool developed for this project. The dataset was then analyzed to characterize the signals and their variability.&#xD;
Several SNN models were trained and evaluated on the tasks, alongside standard ANN baselines for comparison.&#xD;
The experimental protocol included various neural encoding strategies, rigorous training, and consistent evaluation&#xD;
to ensure fair benchmarking. Performance was assessed in terms of classification accuracy, energy consumption (both&#xD;
theoretical and practical), and suitability for embedded deployment.&#xD;
Results demonstrate that while ANNs slightly outperform SNNs in accuracy, SNNs offer substantial gains in energy&#xD;
efficiency, making them highly suitable for low-power applications. Furthermore, deployment tests confirm that SNNs&#xD;
can be effectively implemented on classical embedded hardware, offering a promising pathway for low-power radar-based sensing.&#xD;
This thesis contributes:&#xD;
(i) An automatic annotation tool for FMCW radar data.&#xD;
(ii) A ready-to-use FMCW dataset for BEA applications.&#xD;
(iii) An empirical and conceptual understanding of SNN behavior on radar data.&#xD;
(iv) A detailed evaluation of SNN viability in terms of accuracy and energy consumption.&#xD;
(v) A proof-of-concept embedded implementation that demonstrates integration into existing BEA products.</description>
      <pubDate>Sun, 07 Sep 2025 22:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2268.2/24951</guid>
      <dc:date>2025-09-07T22:00:00Z</dc:date>
    </item>
    <item>
      <title>A Neuromorphic Approach to Slip Detection with 3D Magnetic Sensors</title>
      <link>http://hdl.handle.net/2268.2/24761</link>
      <description>Title: A Neuromorphic Approach to Slip Detection with 3D Magnetic Sensors
Abstract: Robust slip detection remains a major challenge in robotic manipulation, as it is essential for stable and adaptive interaction with objects of varying shapes, textures, and compliance. This thesis proposes a novel approach by designing a biologically inspired algorithm for real-time, asynchronous slip detection using low-cost, three-dimensional magnetic tactile sensors, while explicitly avoiding computationally heavy learning-based approaches.&#xD;
&#xD;
To achieve this, temporal signal processing constrained to first-order filters and saturation functions was combined with a network of simulated spiking neurons (Ribar et al. 2019), inspired by biological tactile sensing and designed for compatibility with future neuromorphic CMOS hardware. The algorithm was implemented as a ROS2 node to enable integration into robotic systems and was first evaluated in a controlled environment, followed by a more qualitative evaluation in scenarios inspired by real-world conditions.&#xD;
&#xD;
The results show that the proposed system can detect incipient slip with a latency below 30 ms, a performance comparable to human tactile response times and to state-of-the-art learning-based methods, which typically rely on higher-cost tactile sensors such as GelSight Mini (Jawale et al. 2024) or uSkin (Yan et al. 2022), whereas our approach uses low-cost, mass-manufacturable 3D magnetic sensors. Gross slip detection, however, was only achieved in controlled experiments at relatively high slip speeds. Robustness to noise during non-contact phases was improved through the inclusion of a contact-detection neuron. While the approach avoids computationally heavy learning methods, the ROS2 implementation required multi-threading to operate correctly, indicating that its computational requirements are reduced but not yet minimal.&#xD;
&#xD;
Despite these promising outcomes, several limitations were observed, including reduced sensitivity on smooth surfaces, occasional false positives, and difficulties in generalizing parameter settings across different objects and surface textures. In addition, hardware constraints restricted the simulation speed of the neurons, thereby limiting their spiking frequencies, which in turn prevented further improvements in delay and sensitivity.&#xD;
&#xD;
In conclusion, this work establishes a proof-of-concept for biologically inspired slip detection in robotics. It demonstrates the feasibility of detecting slip events using low-cost sensors and a network of simulated spiking neurons with asynchronous, event-driven computation, laying the groundwork for future implementations on low-power neuromorphic hardware.</description>
      <pubDate>Sun, 07 Sep 2025 22:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2268.2/24761</guid>
      <dc:date>2025-09-07T22:00:00Z</dc:date>
    </item>
    <item>
      <title>Reliable robotic grasping for uncertain objects through virtual model control</title>
      <link>http://hdl.handle.net/2268.2/23305</link>
      <description>Title: Reliable robotic grasping for uncertain objects through virtual model control
Abstract: Dexterous robotic grasping remains a central challenge in robotics due to the high-dimensional nature of multi-fingered manipulators and the complex physical interactions involved in object manipulation. This thesis investigates the application of the Virtual Mechanisms (VM) control framework to enhance grasping capabilities in anthropomorphic robotic hands. Building on the principles of passivity-based impedance control, the VM approach introduces virtual mechanical elements such as springs, dampers, and inertial components interconnected in operational space to generate intuitive and modular control behaviors. &#xD;
The thesis proposes two main application approaches. &#xD;
The first one utilizes virtual mechanisms as simple and intuitive hand-centered trajectory planners to generate human-inspired grasping motions, employing a taxonomy-based approach to implement the following common grasp types: medium wrap, power sphere, and lateral pinch. &#xD;
The second one extends this approach toward object-centric behavior shaping, wherein virtual mechanisms are dynamically tailored to object geometry. &#xD;
Both strategies are implemented and tested on a Shadow Dexterous Hand robotic platform. Results indicate that virtual mechanisms offer a robust and versatile control paradigm, enhancing grasp robustness and adaptability while preserving stability. The thesis concludes with a proof-of-concept integration of position-based feedback into the VM framework, laying the groundwork for future developments. Overall, this work highlights the potential of virtual mechanisms as a promising alternative to traditional grasp control strategies in dexterous robotic manipulation.</description>
      <pubDate>Sun, 29 Jun 2025 22:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2268.2/23305</guid>
      <dc:date>2025-06-29T22:00:00Z</dc:date>
    </item>
    <item>
      <title>Swing Door Simulator</title>
      <link>http://hdl.handle.net/2268.2/23234</link>
      <description>Title: Swing Door Simulator
Abstract: Currently, no automated testing tool for sensors deployed on swing doors has been developed within BEA Sensors. All tests are performed manually by members of the R&amp;D team, resulting in a long process. Consequently, this extends the development process for new sensors and delays the validation of hardware or software upgrades to existing products. This thesis addresses these challenges by developing a user-friendly testing interface designed to automate and standardize the sensor validation process on swing doors.&#xD;
The developed tool enables the recording of movement data by extracting sensor readings in real-world conditions. The recorded data can then be edited to highlight specific characteristics of the motion. Afterwards, these recordings can be replayed on a servo-driven test door. Finally, each replay can be analyzed in detail to evaluate the performance of the tested sensor. The tool records data in a standardized format and saves it in a dedicated directory to create a comprehensive library of challenging test cases for swing doors. This provides a valuable resource for future sensor validation and development.&#xD;
This thesis explores the development process of the tool, explaining in detail the code architecture chosen to ensure maintainability, robustness, and a clear hierarchy of responsibilities. It also explains how the Festo servo motor is controlled via the CANopen protocol, as well as the methods used to extract data from the BEA Sensors.&#xD;
The tool is already operational and has demonstrated its ability to faithfully reproduce complex motion profiles and extract performance metrics from the replays. While certain limitations remain, such as the limited accuracy of the sensor and the significant system inertia, this testing platform provides a robust foundation for future automation and more advanced sensor validation.</description>
      <pubDate>Sun, 29 Jun 2025 22:00:00 GMT</pubDate>
      <guid isPermaLink="false">http://hdl.handle.net/2268.2/23234</guid>
      <dc:date>2025-06-29T22:00:00Z</dc:date>
    </item>
  </channel>
</rss>

