## 3_biography.md (+1 −1)
@@ -5,7 +5,7 @@ layout: page
*Last update: November 2, 2025*

-I am an incoming assistant professor (January 2026) at Carnegie Mellon University, in the Mechanical Engineering Department and with Courtesy Appointment with the Robotics Institute. Until then, I am working as a starting researcher in the WILLOW team, Inria Paris. Before that, until February 2024, I was a postdoc at the University of Toronto, working with Prof. Tim Barfoot, which I started in March 2022. I obtained my PhD from the Laboratory of Audiovisual Communications ([**LCAV**](https://lcav.epfl.ch)) at École polytechnique fédérale de Lausanne (**EPFL**), under the supervision of Prof. Martin Vetterli and Dr. Adam Scholefield.
+I am an Assistant Professor at Carnegie Mellon University in the Mechanical Engineering Department, and I hold a courtesy appointment with the Robotics Institute. Until December 2025, I worked as a starting researcher in the WILLOW team at Inria Paris. Before that, from March 2022 until February 2024, I was a postdoc at the University of Toronto, working with Prof. Tim Barfoot. I obtained my PhD from the Laboratory of Audiovisual Communications ([**LCAV**](https://lcav.epfl.ch)) at École polytechnique fédérale de Lausanne (**EPFL**), under the supervision of Prof. Martin Vetterli and Dr. Adam Scholefield.
I hold a Master's degree in Mechanical Engineering with a specialization in control and mechatronics from EPFL. I had the opportunity to conduct my Master's thesis at the Autonomous Systems Lab ([**ASL**](https://asl.ethz.ch)) at the Eidgenössische Technische Hochschule Zürich (**ETHZ**), and I spent the third year of my Bachelor studies at **Heriot-Watt** University in Edinburgh.
## _posts/2025-10-27-subspaces.md (+33 −34)
@@ -341,13 +341,43 @@ Although the timings are very imprecise (they include the time to setup the prob
</div>
+### The Kernelized Version
+
+An annoyance of the above method is the need to pick a basis function $\phi(\mathbf{x})$ manually. Since we only use samples of this basis function, it seems natural to instead seek a kernelized method, where the basis functions are implicitly defined through carefully chosen kernels. Such a treatment would also allow for using infinite-dimensional feature functions, such as those induced by Gaussian or Laplacian kernels. For the sake of brevity, however, we defer that discussion to a later blog post.
+
+Here, we focus on finite-dimensional feature functions. For example, for the space of polynomials, it is well known that by choosing the polynomial kernel $k(\mathbf{x}, \mathbf{y}) = (1 + \mathbf{x}^\top \mathbf{y})^d$, the corresponding RKHS will include all polynomials up to degree $d$. In particular, the canonical feature map $\phi(\mathbf{x})$ is made of (appropriately scaled) monomials.
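As a quick sanity check of this correspondence, here is a minimal sketch (the helper names `phi` and `poly_kernel` are illustrative, and the kernel is assumed to be the degree-2 polynomial kernel in one dimension): the binomial expansion $(1 + xy)^2 = 1 + 2xy + x^2y^2$ gives the scaled-monomial feature map $\phi(x) = [1, \sqrt{2}\,x, x^2]$, whose inner product reproduces the kernel value.

```python
import numpy as np

def phi(x):
    # Canonical feature map of the 1-D polynomial kernel (1 + x*y)**2:
    # expanding gives 1 + 2*x*y + x**2 * y**2, i.e. the (appropriately
    # scaled) monomials 1, sqrt(2)*x, x**2.
    return np.array([1.0, np.sqrt(2.0) * x, x**2])

def poly_kernel(x, y, d=2):
    # Inhomogeneous polynomial kernel of degree d.
    return (1.0 + x * y) ** d

x, y = 0.7, -1.3
# The inner product of the explicit feature maps equals the kernel value.
assert np.isclose(phi(x) @ phi(y), poly_kernel(x, y))
```

The same bookkeeping extends to multivariate inputs, with multinomial coefficients providing the scaling of each monomial.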
+
+It can be shown (see, e.g., {% cite rudi_finding_2024 %}) that the solution to the problem (SOS), when evaluated at given samples $\mathbf{x}_i$, involves vectors of the form $[\phi(\mathbf{x}_i)^\top \phi(\mathbf{x}_1), \cdots, \phi(\mathbf{x}_i)^\top\phi(\mathbf{x}_n)]$. We observe that $\phi$ only appears in inner products in this new constraint formulation, meaning that we can rewrite the constraint as a function of $k(\mathbf{x}, \mathbf{y})=\phi(\mathbf{x})^\top\phi(\mathbf{y})$, leading to $\mathbf{k}_i=[k(\mathbf{x}_i, \mathbf{x}_1), \cdots, k(\mathbf{x}_i, \mathbf{x}_n)]$.
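To make the vectors $\mathbf{k}_i$ concrete, here is a short sketch (the helper name `kernel_vectors` is illustrative, and the polynomial kernel is used as a stand-in for whichever kernel one chooses): stacking the $\mathbf{k}_i$ row by row yields the usual Gram matrix, which for a valid kernel is symmetric positive semidefinite.

```python
import numpy as np

def poly_kernel(x, y, d=2):
    # k(x, y) = (1 + x^T y)^d, a finite-dimensional polynomial kernel.
    return (1.0 + x @ y) ** d

def kernel_vectors(X, kernel):
    # Row i is k_i = [k(x_i, x_1), ..., k(x_i, x_n)] for samples x_i in X.
    n = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

X = np.random.default_rng(0).normal(size=(5, 2))
K = kernel_vectors(X, poly_kernel)

# Stacked k_i form the Gram matrix: symmetric and PSD up to numerical error.
assert np.allclose(K, K.T)
assert np.min(np.linalg.eigvalsh(K)) > -1e-9
```

Swapping `poly_kernel` for a Gaussian or Laplacian kernel would leave this code unchanged, which is exactly the appeal of the kernelized formulation.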
+
+This kernelization of the problem allows different kernels to be applied, and it opens up questions such as the choice of the optimal kernel function for a given function class and type of samples. We will dive into this question further in the next blog post.
### Conclusion and Discussion
We've seen that by taking a subspace perspective, we can derive alternative but equivalent relaxations for polynomial optimization problems. Depending on the dimensions of the subspace $\mathcal{V}$ and its complement $\mathcal{V}^\perp$, one form might be more computationally efficient than the other. For instance, if the nullspace $\mathcal{V}^\perp$ has a very small dimension, the primal kernel form might have far fewer constraints than the image form.

-An obvious limitation of this approach is that it cannot easily deal with inequality constraints. However, I would argue that at least the equality-constrained part can handled in a very elegant way through this subspace view.
-
-To complete the picture, a natural next step would be to define the matrix $\mathbf{C}$ from samples directly, and to explore alternative bases as opposed to the monomial basis. I am planning to treat these topics in a follow-up blogpost.
+
+An obvious limitation of this approach is that it cannot easily deal with inequality constraints. However, at least the equality-constrained part of polynomial optimization problems can be handled in a very elegant way through this subspace view.
### Appendix 1: Verification via the Duals {#appendix1}
@@ -437,37 +467,6 @@ The constraint $\langle \mathbf{A}\_0, \mathbf{X} \rangle = 1$ is a standard nor
By enforcing this on the relaxed variable $\mathbf{X}$, we are essentially saying that the underlying measure $\mu$ is a probability measure, i.e., $\int d\mu(x) = 1$. This prevents the trivial solution where $\mathbf{X}=0$.
-
-### The Kernelized Version
## index.md (+3 −5)
@@ -4,18 +4,16 @@ layout: page
image: profile-crop.jpg
---

-I am an incoming assistant professor at CMU Mechanical Engineering, with courtesy appointment with the CMU Robotics Institute. My research interest is to make autonomous robots more capable and trustworthy by exploiting recent advances in optimization and artificial intelligence. I am currently a starting researcher in the Willow research group at Inria Paris. Until February 2024, I was a postdoc at the University of Toronto in the Robotics Institute, working with Timothy D. Barfoot on certifiably optimal state estimation. Before that, I obtained my Ph.D. in the AudioVisual Communications Laboratory (LCAV) at École polytechnique fédérale de Lausanne (EPFL) under the supervision of Martin Vetterli and Adam Scholefield, working on non-visual spatial perception.
+I am an Assistant Professor in Mechanical Engineering at Carnegie Mellon University, and I hold a courtesy appointment with the Robotics Institute. My research interest is to make autonomous robots more capable and trustworthy by exploiting recent advances in optimization and artificial intelligence. Until December 2025, I was a starting researcher in the Willow research group at Inria Paris. Until February 2024, I was a postdoc at the University of Toronto in the Robotics Institute, working with Timothy D. Barfoot on certifiably optimal state estimation. Before that, I obtained my Ph.D. in the AudioVisual Communications Laboratory (LCAV) at École polytechnique fédérale de Lausanne (EPFL) under the supervision of Martin Vetterli and Adam Scholefield, working on non-visual spatial perception.
Undergraduate / Master's Research Students: If you are passionate about optimization for robotics and would like to work with me in the January 2026 or September 2026 semester, please get in touch via e-mail with your CV and a short research statement. You can also browse [this page](https://www.meche.engineering.cmu.edu/education/undergraduate-education/undergraduate-experience/projects/index.html) (CMU internal) for open positions.
-**CMU MechE Research Students**: I am happy to co-advise students starting September 2025, if there is a strong fit. Don't hesitate to get in touch (see below).
-
-If you are passionate about optimization for robotics, and would like to work with me at CMU starting January 2026 or September 2026, please get in touch via e-mail with your CV and a short research statement.
+
+Postdocs: I am always open to interviewing strong candidates. In particular, if you would like to apply to join my lab through the [Carnegie Bosch Institute postdoctoral fellowship](https://carnegiebosch.cmu.edu/fellowships/index.html) (the deadline is January 31, 2026), please get in touch.