<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Theses | Andrea Del Prete</title><link>https://andreadelprete.github.io/theses/</link><atom:link href="https://andreadelprete.github.io/theses/index.xml" rel="self" type="application/rss+xml"/><description>Theses</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Thu, 02 Apr 2026 00:00:00 +0000</lastBuildDate><image><url>https://andreadelprete.github.io/media/icon_hu0b7a4cb9992c9ac0e91bd28ffd38dd00_9727_512x512_fill_lanczos_center_3.png</url><title>Theses</title><link>https://andreadelprete.github.io/theses/</link></image><item><title>Bachelor Thesis</title><link>https://andreadelprete.github.io/theses/perception-aware-control/</link><pubDate>Thu, 02 Apr 2026 00:00:00 +0000</pubDate><guid>https://andreadelprete.github.io/theses/perception-aware-control/</guid><description>&lt;h1 id="perception-aware-control">Perception-Aware Control&lt;/h1>
&lt;p>Goal: control a robot manipulator while ensuring that the target remains within the field of view (FoV) of the camera mounted on the end-effector.&lt;/p>
&lt;h2 id="description">Description&lt;/h2>
&lt;p>The starting point is to control the robot with velocity-level differential inverse kinematics, i.e. using a Jacobian pseudo-inverse and sending joint velocity commands to the robot.
Then we need to add a second task: keeping the target inside the camera&amp;rsquo;s FoV. This task is actually more important than the first, but it only needs to be active when the target reaches the limits of the camera&amp;rsquo;s FoV. Whenever that happens, the primary task should become moving the robot so that the target moves towards the center of the FoV; this task takes at most 2 DoF. The secondary task can then be moving the end-effector towards the target.
If instead the target is not at the boundary of the FoV, we can simply perform the single task of moving the end-effector towards the target.&lt;/p>
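&lt;p>The priority switch described above can be implemented with null-space projection: the secondary task is projected into the null space of the primary one, so it cannot perturb it. Below is a minimal sketch (function and variable names are illustrative, assuming NumPy), not a definitive implementation:&lt;/p>

```python
import numpy as np

def prioritized_ik_velocity(J1, e1, J2, e2, gain=1.0):
    """Two-task differential inverse kinematics at the velocity level.
    Task 1 (e.g. FoV centering) has strict priority; task 2 (e.g.
    reaching the target) acts in the null space of task 1.
    J1: (m1, n) Jacobian of the primary task, e1: (m1,) task error
    J2: (m2, n) Jacobian of the secondary task, e2: (m2,) task error
    Returns the joint velocity command dq of shape (n,)."""
    J1_pinv = np.linalg.pinv(J1)
    # Joint velocity realizing the primary task
    dq1 = gain * (J1_pinv @ e1)
    # Null-space projector of the primary task
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1
    # Secondary task resolved inside the null space of task 1
    dq2 = np.linalg.pinv(J2 @ N1) @ (gain * e2 - J2 @ dq1)
    return dq1 + N1 @ dq2
```

&lt;p>When the target is well inside the FoV, the same structure degenerates to the single reaching task, i.e. a plain pseudo-inverse command.&lt;/p>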
&lt;h3 id="projecting-the-target-to-the-2d-image-space">Projecting the target to the 2D image space&lt;/h3>
&lt;p>To detect whether the target is on the boundary of the FoV, we have to project it to the 2D image space. Moreover, to implement the FoV task we must compute the Jacobian associated with the position of the target in the 2D image space, which depends on both the position and the orientation of the camera.&lt;/p>
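&lt;p>A sketch of the two computations, assuming NumPy: the pinhole projection of the target, and the classical interaction matrix of a point feature (function names and intrinsic parameters are illustrative). Multiplying the interaction matrix by the Jacobian that maps joint velocities to the camera twist yields the task Jacobian mentioned above.&lt;/p>

```python
import numpy as np

def project_point(p_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3D point, expressed in the camera frame,
    to pixel coordinates (u, v)."""
    X, Y, Z = p_cam
    u = fx * X / Z + cx
    v = fy * Y / Z + cy
    return np.array([u, v])

def image_jacobian(p_cam):
    """Interaction matrix L (2x6) of a point feature in normalized image
    coordinates: s_dot = L @ v_cam, where v_cam is the camera twist
    (linear velocity, angular velocity) in the camera frame."""
    X, Y, Z = p_cam
    x, y = X / Z, Y / Z
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])
```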
&lt;h3 id="expected-results">Expected results&lt;/h3>
&lt;p>At the end of this work, the student should have produced:&lt;/p>
&lt;ul>
&lt;li>Python code implementing the described control framework&lt;/li>
&lt;li>a document describing in detail the work, including the mathematics behind it&lt;/li>
&lt;li>a video showing the behavior of the robot in simulation&lt;/li>
&lt;li>a video showing the behavior of the real robot&lt;/li>
&lt;/ul></description></item><item><title>MyCobot Identification</title><link>https://andreadelprete.github.io/theses/mycobot-identification/</link><pubDate>Thu, 02 Apr 2026 00:00:00 +0000</pubDate><guid>https://andreadelprete.github.io/theses/mycobot-identification/</guid><description>&lt;h1 id="mycobot280-identification">MyCobot280 Identification&lt;/h1>
&lt;p>Goal: identify the dynamic behavior of the MyCobot280 robot manipulators.&lt;/p>
&lt;h2 id="description">Description&lt;/h2>
&lt;p>The MyCobot280 robot manipulators accept only joint position commands as control inputs.
These position commands are used internally as targets to generate interpolating trajectories, which then serve as references for the low-level motor controllers.&lt;/p>
&lt;p>In order to simulate the behavior of the robot accurately, we would like to identify this interpolation procedure. We can do this by collecting data on the robot and then fitting different candidate trajectory models to the collected data points. My current guess is that these are minimum-time trajectories with bounded velocity and acceleration.
To identify the maximum velocity of each joint, we can command it to travel a long distance and observe the peak velocity it reaches.&lt;/p>
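&lt;p>Under this guess, a candidate interpolation model can be sketched as a minimum-time trapezoidal velocity profile (triangular when the distance is too short to reach the velocity bound). The interface and names below are illustrative; fitting then amounts to choosing the velocity and acceleration bounds that minimize the error against the logged joint positions.&lt;/p>

```python
import math

def min_time_profile(q0, qf, v_max, a_max):
    """Minimum-time point-to-point trajectory with bounded velocity and
    acceleration: trapezoidal velocity profile, degenerating to a
    triangular one if v_max is never reached.
    Returns (q, t_total), where q(t) is the position at time t."""
    d = abs(qf - q0)
    s = 1.0 if qf >= q0 else -1.0
    t_acc = v_max / a_max                 # time to reach cruise velocity
    if d > v_max * t_acc:                 # a cruise phase exists
        t_cruise = d / v_max - t_acc
    else:                                 # triangular profile
        t_acc = math.sqrt(d / a_max)
        t_cruise = 0.0
    t_total = 2.0 * t_acc + t_cruise

    def q(t):
        t = min(max(t, 0.0), t_total)     # clamp to the motion interval
        if t_acc >= t:                    # acceleration phase
            return q0 + s * 0.5 * a_max * t * t
        if t_acc + t_cruise >= t:         # cruise phase
            return q0 + s * (0.5 * a_max * t_acc**2
                             + a_max * t_acc * (t - t_acc))
        return qf - s * 0.5 * a_max * (t_total - t)**2  # deceleration

    return q, t_total
```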
&lt;h3 id="expected-results">Expected results&lt;/h3>
&lt;p>At the end of this work, the student should have produced:&lt;/p>
&lt;ul>
&lt;li>Python code implementing an accurate simulator of the robot&lt;/li>
&lt;li>a document describing in detail the work, presenting and discussing the collected data, comparing the simulated behavior with the real robot data&lt;/li>
&lt;li>a video showing the behavior of the robot in simulation&lt;/li>
&lt;li>a video showing the behavior of the real robot&lt;/li>
&lt;/ul></description></item></channel></rss>