
Commit e4b9978

committed
Update
1 parent 81e7455 commit e4b9978

4 files changed

Lines changed: 50 additions & 63 deletions

File tree

3_biography.md
Gemfile.lock
_posts/2025-10-27-subspaces.md
index.md

3_biography.md

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ layout: page
 
 *Last update: November 2, 2025*
 
-I am an incoming assistant professor (January 2026) at Carnegie Mellon University, in the Mechanical Engineering Department and with Courtesy Appointment with the Robotics Institute. Until then, I am working as a starting researcher in the WILLOW team, Inria Paris. Before that, until February 2024, I was a postdoc at the University of Toronto, working with Prof. Tim Barfoot, which I started in March 2022. I obtained my PhD from the Laboratory of Audiovisual Communications ([**LCAV**](https://lcav.epfl.ch)) at École polytechnique fédérale de Lausanne (**EPFL**), under the supervision of Prof. Martin Vetterli and Dr. Adam Scholefield.
+I am an Assistant Professor at Carnegie Mellon University, in the Mechanical Engineering Department, and I hold a courtesy appointment with the Robotics Institute. Until December 2025, I was working as a starting researcher in the WILLOW team, Inria Paris. Before that, I was a postdoc at the University of Toronto from March 2022 until February 2024, working with Prof. Tim Barfoot. I obtained my PhD from the Laboratory of Audiovisual Communications ([**LCAV**](https://lcav.epfl.ch)) at École polytechnique fédérale de Lausanne (**EPFL**), under the supervision of Prof. Martin Vetterli and Dr. Adam Scholefield.
 
 I hold a Master's degree in Mechanical Engineering with specialization in control and mechatronics from EPFL. I had the opportunity to conduct my Master's Thesis at the Autonomous Systems Lab ([**ASL**](https://asl.ethz.ch)) at the Eidgenössische Technische Hochschule Zürich (**ETHZ**), and I spent the third year of my Bachelor studies at **Heriot-Watt** University in Edinburgh.
 

Gemfile.lock

Lines changed: 13 additions & 23 deletions
@@ -1,8 +1,8 @@
 GEM
   remote: https://rubygems.org/
   specs:
-    addressable (2.8.7)
-      public_suffix (>= 2.0.2, < 7.0)
+    addressable (2.8.8)
+      public_suffix (>= 2.0.2, < 8.0)
     bibtex-ruby (6.2.0)
       latex-decode (~> 0.0)
       logger (~> 1.7)
@@ -14,30 +14,23 @@ GEM
       namae (~> 1.0)
       observer (< 1.0)
       open-uri (< 1.0)
-    citeproc-ruby (2.1.2)
+    citeproc-ruby (2.1.8)
       citeproc (~> 1.0, >= 1.0.9)
       csl (~> 2.0)
       observer (< 1.0)
     colorator (1.1.0)
     concurrent-ruby (1.3.5)
-    csl (2.1.0)
-      forwardable (~> 1.3)
-      namae (~> 1.2)
-      open-uri (< 1.0)
-      rexml (~> 3.0)
-      set (~> 1.1)
-      singleton (< 1.0)
-      time (< 1.0)
-    csl-styles (2.0.1)
+    csl (2.0.0)
+      namae (~> 1.0)
+      rexml
+    csl-styles (2.0.2)
       csl (~> 2.0)
-    date (3.4.1)
+    date (3.5.1)
     em-websocket (0.5.3)
       eventmachine (>= 0.12.9)
       http_parser.rb (~> 0)
     eventmachine (1.2.7)
-    ffi (1.17.2)
-    ffi (1.17.2-arm64-darwin)
-    ffi (1.17.2-x86_64-darwin)
+    ffi (1.17.3)
     forwardable (1.3.3)
     forwardable-extended (2.6.0)
     http_parser.rb (0.8.0)
@@ -90,7 +83,7 @@ GEM
       uri
     pathutil (0.16.2)
       forwardable-extended (~> 2.6)
-    public_suffix (6.0.2)
+    public_suffix (5.1.1)
     racc (1.8.1)
     rb-fsevent (0.11.2)
     rb-inotify (0.11.1)
@@ -100,20 +93,17 @@ GEM
     safe_yaml (1.0.5)
     sassc (2.4.0)
       ffi (~> 1.9)
-    set (1.1.2)
-    singleton (0.3.0)
-    stringio (3.1.7)
+    stringio (3.2.0)
     terminal-table (1.8.0)
       unicode-display_width (~> 1.1, >= 1.1.1)
-    time (0.4.1)
+    time (0.4.2)
       date
     unicode-display_width (1.8.0)
     uri (1.0.3)
     webrick (1.9.1)
 
 PLATFORMS
   arm64-darwin
-  ruby
   x86_64-darwin
   x86_64-linux
 
@@ -127,4 +117,4 @@ DEPENDENCIES
   webrick
 
 BUNDLED WITH
-   2.5.23
+   2.3.22

_posts/2025-10-27-subspaces.md

Lines changed: 33 additions & 34 deletions
@@ -341,13 +341,43 @@ Although the timings are very imprecise (they include the time to setup the prob
 
 </div>
 
+### The Kernelized Version
+
+An annoyance of the above method is the need to pick a basis function $\phi(\mathbf{x})$ manually. Since we only use samples of this basis function, it seems natural to instead seek a kernelized method, where the basis functions are implicitly defined through carefully chosen kernels. Such a treatment would also allow for using infinite-dimensional feature functions, such as the ones induced by Gaussian or Laplacian kernels. However, for the sake of brevity, we move such a discussion to a later blogpost.
+
+Here, we focus on finite-dimensional feature functions. For example, for the space of polynomials, it is well known that by choosing the kernel
+
+$$
+k(\mathbf{x},\mathbf{y})=(1 + \mathbf{x}^\top\mathbf{y})^d,
+$$
+
+the corresponding RKHS will include all polynomials up to degree $d$. In particular, the canonical feature map $\phi(\mathbf{x})$ is made of (appropriately scaled) monomials.
+
+It can be shown (see, e.g., {% cite rudi_finding_2024 %}) that the solution to the problem (SOS), when evaluated at given samples $\mathbf{x}_i$, is of the form
+
+$$
+\hat{\mathbf{H}} = \sum_{i,j=1}^n \mathbf{F}_{ij} \phi(\mathbf{x}_i)\phi(\mathbf{x}_j)^\top = \mathbf{\Phi}^\top \mathbf{F} \mathbf{\Phi},
+$$
+
+where $\mathbf{F}$ is an unknown coefficient matrix and we have introduced $\mathbf{\Phi}^\top=[\phi(\mathbf{x}_1), \cdots, \phi(\mathbf{x}_n)]$.
+Plugging this into the constraints of the original (SOS) problem, we obtain
+
+$$
+\phi(\mathbf{x}_i)^\top \hat{\mathbf{H}}\phi(\mathbf{x}_i) =
+\phi(\mathbf{x}_i)^\top \mathbf{\Phi}^\top\mathbf{F}\mathbf{\Phi}\phi(\mathbf{x}_i) =
+\mathbf{k}_i^\top \mathbf{F}\mathbf{k}_i,
+$$
+
+where we introduced $\mathbf{k}_i :=
+[\phi(\mathbf{x}_i)^\top \phi(\mathbf{x}_1), \cdots, \phi(\mathbf{x}_i)^\top\phi(\mathbf{x}_n)]$. We observe that $\phi$ only appears in inner products in this new constraint formulation, meaning that we can rewrite the constraint as a function of $k(\mathbf{x}, \mathbf{y})=\phi(\mathbf{x})^\top\phi(\mathbf{y})$, leading to $\mathbf{k}_i=[k(\mathbf{x}_i, \mathbf{x}_1), \cdots, k(\mathbf{x}_i, \mathbf{x}_n)]$.
+
+This kernelization of the problem allows for different kernels to be applied, and raises questions like the choice of the optimal kernel function, based on the function class and types of samples. We will dive into this question further in the next blog post.
+
 ### Conclusion and Discussion
 
 We've seen that by taking a subspace perspective, we can derive alternative but equivalent relaxations for polynomial optimization problems. Depending on the dimensions of the subspace $\mathcal{V}$ and its complement $\mathcal{V}^\perp$, one form might be more computationally efficient than the other. For instance, if the nullspace $\mathcal{V}^\perp$ has a very small dimension, the primal kernel form might have far fewer constraints than the image form.
 
-An obvious limitation of this approach is that it cannot easily deal with inequality constraints. However, I would argue that at least the equality-constrained part can handled in a very elegant way through this subspace view.
-
-To complete the picture, a natural next step would be to define the matrix $\mathbf{C}$ from samples directly, and to explore alternative bases as opposed to the monomial basis. I am planning to treat these topics in a follow-up blogpost.
+An obvious limitation of this approach is that it cannot easily deal with inequality constraints. However, at least the equality-constrained part of polynomial optimization problems can be handled in a very elegant way through this subspace view.
 
 ### Appendix 1: Verification via the Duals {#appendix1}
 
@@ -437,37 +467,6 @@ The constraint $\langle \mathbf{A}\_0, \mathbf{X} \rangle = 1$ is a standard nor
 
 By enforcing this on the relaxed variable $\mathbf{X}$, we are essentially saying that the underlying measure $\mu$ is a probability measure, i.e., $\int d\mu(x) = 1$. This prevents the trivial solution where $\mathbf{X}=0$.
 
-### The Kernelized Version
-
-An annoyance of the above method is the need to pick a basis function $\phi(\mathbf{x})$ manually. Since we only use samples of this basis function, it seems natural to instead seek a kernelized method, where the basis functions are implicitly defined through carefully chosen kernels. Such a treatment would also allow for using infinite-dimensional feature functions, such as the ones induced by Gaussian or Laplacian kernels. However, for the sake of brevity, we move such a discussion to a later blogpost.
-
-Here, we focus on finite-dimensional feature functions. For example, for the space of polynomials, it is well-known that by chosing the kernel
-
-$$
-k(\mathbf{x},\mathbf{y})=(1 + \mathbf{x}^\top\mathbf{y})^d,
-$$
-
-the corresponding RKHS will include all polynomials up to degree $d$. In particular, the canonical feature map $\phi(\mathbf{x})$ is made of (appropriately scaled) monomials.
-
-It can be shown (see, e.g., {% cite rudi_finding_2024 %}) that the solution to the problem (SOS), when evaluating at given samples $\mathbf{x}_i$, is of the form
-
-$$
-\hat{\mathbf{H}} = \sum_{i,j=1}^n \mathbf{F}_{ij} \phi(\mathbf{x}_i)\phi(\mathbf{x}_j)^\top = \mathbf{\Phi}^\top \mathbf{F} \mathbf{\Phi},
-$$
-
-where $\mathbf{F}$ is an unknown coefficients matrix and we have introduced $\mathbf{\Phi}^\top=[\phi(\mathbf{x}_1), \cdots, \phi(\mathbf{x}_n)]$.
-Plugging this into the constraints of the original (SOS) problem, we obtain
-
-$$
-\phi(\mathbf{x}_i)^\top \mathbf{H}\phi(\mathbf{x}_i) =
-\phi(\mathbf{x}_i)^\top \mathbf{\Phi}^\top\mathbf{F}\mathbf{\Phi}\phi(\mathbf{x}_i) =
-\mathbf{k}_i^\top \mathbf{F}\mathbf{k}_i
-$$
-
-where we introduced $\mathbf{k}_i :=
-[\phi(\mathbf{x}_i)^\top \phi(\mathbf{x}_1), \cdots \phi(\mathbf{x}_i)^\top\phi(\mathbf{x}_n))]$. We observe that $\phi$ only appears in inner products in this new constraint formulation, meaning that we rewrite the constraint as a function of $k(\mathbf{x}, \mathbf{y})=\phi(\mathbf{x})^\top\phi(\mathbf{y})$, leading to $\mathbf{k}_i=[k(\mathbf{x}_i, \mathbf{x}_1), \cdots, k(\mathbf{x}_i, \mathbf{x}_n)]$.
-
-This kernelization of the problem allows for different kernels to be applied, and to address questions like the choice of the optimal kernel function, based on the function class and types of samples. We will dive into this question further in the next blog post.
 
 ### Bibliography

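The kernelized constraints added in `_posts/2025-10-27-subspaces.md` can be sanity-checked numerically. Below is a minimal sketch (not part of this commit), assuming the 1-D polynomial kernel $k(x,y)=(1+xy)^2$ (i.e., $d=2$), whose canonical feature map is the vector of scaled monomials $[1, \sqrt{2}x, x^2]$; the sample points and the symmetric matrix `F` are arbitrary illustrations:

```python
import numpy as np

# Sanity check of the kernelized constraint phi(x_i)^T H phi(x_i) = k_i^T F k_i,
# assuming the 1-D polynomial kernel k(x, y) = (1 + x*y)^2 (degree d = 2),
# whose canonical feature map is phi(x) = [1, sqrt(2)*x, x^2].
# The samples and the symmetric matrix F below are arbitrary.

rng = np.random.default_rng(0)

def phi(x):
    # scaled monomials, chosen so that phi(x) . phi(y) = (1 + x*y)^2
    return np.array([1.0, np.sqrt(2.0) * x, x**2])

def k(x, y):
    return (1.0 + x * y) ** 2

x = rng.standard_normal(4)             # n = 4 sample points
Phi = np.stack([phi(xi) for xi in x])  # rows are phi(x_i)^T, the post's Phi

# The kernel reproduces the feature-space inner products (Gram matrix K).
K = Phi @ Phi.T
assert np.allclose(K, [[k(a, b) for b in x] for a in x])

# For H = Phi^T F Phi, each constraint value phi(x_i)^T H phi(x_i)
# equals k_i^T F k_i, where k_i is the i-th row of K.
F = rng.standard_normal((4, 4))
F = F + F.T                            # arbitrary symmetric coefficient matrix
H = Phi.T @ F @ Phi
for i in range(4):
    assert np.isclose(phi(x[i]) @ H @ phi(x[i]), K[i] @ F @ K[i])
print("kernelized constraints match")
```

Since $\phi$ enters only through inner products, the constraint values can be computed from kernel evaluations alone, which is what makes the kernelized formulation work.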
index.md

Lines changed: 3 additions & 5 deletions
@@ -4,18 +4,16 @@ layout: page
 image: profile-crop.jpg
 ---
 
-I am an incoming assistant professor at CMU Mechanical Engineering, with courtesy appointment with the CMU Robotics Institute. My research interest is to make autonomous robots more capable and trustworthy by exploiting recent advances in optimization and artificial intelligence. I am currently a starting researcher in the Willow research group at Inria Paris. Until February 2024, I was a postdoc at the University of Toronto in the Robotics Institute, working with Timothy D. Barfoot on certifiably optimal state estimation. Before that, I obtained my Ph.D. in the AudioVisual Communications Laboratory (LCAV) at École polytechnique fédérale de Lausanne (EPFL) under the supervision of Martin Vetterli and Adam Scholefield, working on non-visual spatial perception.
+I am an Assistant Professor in Mechanical Engineering at Carnegie Mellon University, with a courtesy appointment with the Robotics Institute. My research interest is to make autonomous robots more capable and trustworthy by exploiting recent advances in optimization and artificial intelligence. Until December 2025, I was a starting researcher in the Willow research group at Inria Paris. Until February 2024, I was a postdoc at the University of Toronto in the Robotics Institute, working with Timothy D. Barfoot on certifiably optimal state estimation. Before that, I obtained my Ph.D. in the AudioVisual Communications Laboratory (LCAV) at École polytechnique fédérale de Lausanne (EPFL) under the supervision of Martin Vetterli and Adam Scholefield, working on non-visual spatial perception.
 
 [Google Scholar](https://scholar.google.com/citations?hl=en&user=45QAZbUAAAAJ) / [LinkedIn](https://www.linkedin.com/in/duembgen/)
 
 
 ### Working with me
 
+**Undergraduate / Master's research students**: If you are passionate about optimization for robotics and would like to work with me in the January 2026 or September 2026 semester, please get in touch via e-mail with your CV and a short research statement. You can also browse [this page](https://www.meche.engineering.cmu.edu/education/undergraduate-education/undergraduate-experience/projects/index.html) (CMU internal) for open positions.
 
-**CMU MechE Research Students**: I am happy to co-advise students starting September 2025, if there is a strong fit. Don't hesitate to get in touch (see below).
-
-If you are passionate about optimization for robotics, and would like to work with me at CMU starting January 2026 or September 2026, please get in touch via e-mail with your CV and a short research statement.
-
+**Postdocs**: I am always open to interviewing strong candidates. In particular, if you would like to apply to join my lab with the [Carnegie Bosch Institute postdoctoral fellowship](https://carnegiebosch.cmu.edu/fellowships/index.html) (deadline: January 31, 2026), please get in touch.
 
 ### Latest news
 
