HW3. Virtual Sensors¶
Due date: 2026-02-05 23:59.
Recommended Resources:
- SF: Sensing and Filtering, Steven M. LaValle, PDF
- Recorded Lectures and Slides
3D Preimage Visualization¶
In this part of the assignment, you will explore preimages of virtual sensors for a planar mobile robot with heading.
Rather than reasoning purely algebraically, you will use an interactive visualization tool to build geometric intuition about how different sensor mappings partition the state space. You are strongly encouraged to interact with the visualizer extensively before answering the questions that follow.
Problem Setup¶
For questions in this part of the homework, consider a point-sized mobile robot moving in a planar environment:
E \subset \mathbb{R}^2,
which is either:
- a square:
E = [-1,1]^2,
- or a disk:
E = \{(x,y)\in\mathbb{R}^2 \mid x^2+y^2 \le 1\}.
The robot state is:
x = (x,y,\theta) \in X = E \times S^1,
where θ ∈ S¹ denotes the robot's heading.
A virtual sensor is modeled as a mapping:
h : X \to Y,
where the observation space Y ⊆ ℝ depends on the sensor definition.
Sensor Models¶
The visualizer implements the following sensor mappings:
- Sensor h₁: Distance to the closest boundary. This sensor measures the shortest distance from the robot's position to the boundary of the environment, ∂E, regardless of orientation:
h_1(x,y,\theta) = \operatorname{dist}((x,y), \partial E).
- Sensor h₂: Distance along heading. This sensor measures how far the robot is free to move straight ahead in the direction of its heading θ:
h_2(x,y,\theta) = \inf \{\, t > 0 \mid (x,y) + t(\cos \theta, \sin \theta) \notin E \,\}.
Here, inf denotes the infimum of a set: its greatest lower bound, i.e., the greatest value that is less than or equal to every element of the set. In this context, h₂(x, y, θ) is the distance from position (x, y), along heading θ, at which the robot leaves the environment E.
- Sensor h₃: Distance behind the robot. This sensor measures the distance to the boundary directly behind the robot, i.e., along the direction θ + π:
h_3(x,y,\theta) = h_2(x,y,\theta+\pi).
- Sensor h₄: Corridor width along heading. This sensor measures the total free space along the robot's forward–backward direction by summing the distances in front of and behind the robot:
h_4(x,y,\theta) = h_2(x,y,\theta) + h_3(x,y,\theta).
- Sensor h₅: Two-ray aperture sensor. This sensor measures the distance to the closest boundary along two rays at angles θ + α and θ − α, and reports the smaller of the two distances:
h_5(x,y,\theta) = \min\{\, h_2(x,y,\theta+\alpha),\; h_2(x,y,\theta-\alpha) \,\}.
⚠️ Note About the Visualizer¶
As seen in previous assignments, numerically computing roots of real-valued functions is subject to sampling limitations and numerical error. For the same reasons, this visualizer provides only a rough approximation of a preimage. It may miss some valid states or include extra ones.
Therefore, while the visualizer is useful for building intuition, treat its output only as guidance and apply analytical reasoning to answer the following questions correctly.
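To see why a sampled preimage can both miss valid states and include extra ones, consider a minimal sketch of this kind of estimate (my own illustration, not the visualizer's actual code), using h₁ on the disk: a state is accepted whenever |h(x, y, θ) − value| falls below a tolerance, so the result depends on both the grid resolution and the tolerance.

```python
import math

def approximate_preimage(h, value, n_xy=50, n_theta=50, tol=0.02):
    """Grid-sample the state space E × S¹ and keep the states where
    |h(x, y, θ) − value| < tol.  Coarse grids miss valid states; loose
    tolerances include extra ones."""
    states = []
    for i in range(n_xy):
        x = -1.0 + 2.0 * i / (n_xy - 1)
        for j in range(n_xy):
            y = -1.0 + 2.0 * j / (n_xy - 1)
            if x * x + y * y > 1.0:      # restrict to the disk environment
                continue
            for k in range(n_theta):
                theta = 2.0 * math.pi * k / n_theta
                if abs(h(x, y, theta) - value) < tol:
                    states.append((x, y, theta))
    return states

# h1 on the disk: distance to the boundary, independent of heading.
h1 = lambda x, y, theta: 1.0 - math.hypot(x, y)

# Every accepted sample lies within tol of the circle x² + y² = 1/4,
# but the exact preimage (a surface) is only hit at grid points.
samples = approximate_preimage(h1, 0.5)
```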
Visualizer Recommended Use¶
- Choose the environment first. Select either the disk or the square environment.
- Select a sensor. Choose one of the five sensors h₁ through h₅.
- Set a sensor value and compute. Enter a value of the sensor output and click Compute. The visualizer will numerically estimate the preimage of that value by sampling:
  - all (x, y) positions in the environment, and
  - all heading angles θ ∈ S¹.
- Choose the sampling resolution. You can control the resolution using:
  - N_xy (sampling over position), and
  - N_θ (sampling over heading).
  - Larger values produce a more detailed approximation, but they also take longer to compute.
  - Recommendation: use values no larger than about 100 for both N_xy and N_θ.
- Animate preimages (optional). You may display an animation of preimages for sensor values ranging from h_min to h_max.
  - Select h_min, h_max, and how many intermediate steps are shown, and click Play Animation.
  - Example: an animation for sensor h₁ is shown below.
- Change the viewing angle. You can rotate and manipulate the 3D view to observe preimages from different perspectives.
  - This is especially useful for building intuition about the geometry and dimension of preimage sets.
  - For example, the same preimage can appear very different when viewed from another angle.
%✅ h₁⁻¹(0) = ∂E × S¹.
%❌ h₁⁻¹(0) = E × S¹.
%✅ h₁⁻¹(1) = {(0, 0)} × S¹.
%❌ h₁⁻¹(1) = {(x,y) | x² + y² = 1} × S¹.
%✅ h₁⁻¹(2) = ∅.
%✅ h₁⁻¹(1/2) = {(x,y,θ) | x² + y² = 1/4}.
%✅ The dimension of preimage h₁⁻¹(1/2) is 2.
%✅ The range of h₁ is exactly [0,1].
%❌ The dimension of all preimages is 2.
%✅ The set {(x,y,θ) | h₁(x,y,θ) > 0} equals int(E) × S¹, in which int(E) denotes all points strictly inside E, not on the boundary.
%✅ ∀ θ₁ ≠ θ₂, if (x, y, θ₁) ∈ h₁⁻¹(v), then (x, y, θ₂) ∈ h₁⁻¹(v).
%✅ h₁⁻¹(0) = ∂E × S¹.
%❌ h₁⁻¹(0) = E × S¹.
%✅ h₁⁻¹(1) = {(0, 0)} × S¹.
%❌ h₁⁻¹(1) = [−1,1] × {0} × S¹.
%✅ h₁⁻¹(2) = ∅.
%✅ [−1, 1] × {−1} × S¹ ⊆ h₁⁻¹(0).
%❌ h₁⁻¹(1/2) = [−1/2, 1/2]² × S¹.
%✅ The dimension of preimage h₁⁻¹(1/2) is 2.
%✅ The range of h₁ is exactly [0,1].
%❌ The dimension of all preimages is 2.
%✅ The set {(x,y,θ) | h₁(x,y,θ) > 0} equals int(E) × S¹, in which int(E) denotes all points strictly inside E, not on the boundary.
%✅ ∀ θ₁ ≠ θ₂, if (x, y, θ₁) ∈ h₁⁻¹(v), then (x, y, θ₂) ∈ h₁⁻¹(v).
%❌ h₂⁻¹(0) = ∂E × S¹.
%❌ h₂⁻¹(0) = E × S¹.
%❌ h₂⁻¹(1) = {(0, 0)} × S¹.
%✅ {(0, 0)} × S¹ ⊆ h₂⁻¹(v).
%✅ h₂⁻¹(1) can be expressed as the union over θ ∈ S¹ of planar circular arcs, each of which is located in a plane parallel to the (x,y)-plane.
%❌ h₂⁻¹(2) = ∅.
%✅ h₂⁻¹(1/2) can be expressed as the union over θ ∈ S¹ of planar circular arcs, each of which is located in a plane parallel to the (x,y)-plane.
%✅ The range of h₂ is exactly [0,2].
%❌ The dimension of all preimages is 2.
%❌ The set {(x,y,θ) | h₂(x,y,θ) > 0} equals int(E) × S¹, in which int(E) denotes all points strictly inside E, not on the boundary.
%❌ ∀ θ₁ ≠ θ₂, if (x, y, θ₁) ∈ h₂⁻¹(v), then (x, y, θ₂) ∈ h₂⁻¹(v).
%❌ h₂⁻¹(0) = ∂E × S¹.
%❌ h₂⁻¹(0) = E × S¹.
%❌ h₂⁻¹(1) = {(0, 0)} × S¹.
%❌ {(0, 0)} × S¹ ⊆ h₂⁻¹(v).
%✅ h₂⁻¹(1) can be expressed as the union over θ ∈ S¹ of line segments, each of which is located in a plane parallel to the (x,y)-plane.
%❌ h₂⁻¹(2) = ∅.
%✅ h₂⁻¹(1/2) can be expressed as the union over θ ∈ S¹ of line segments, each of which is located in a plane parallel to the (x,y)-plane.
%❌ The range of h₂ is exactly [0,1].
%❌ The dimension of all preimages is 2.
%❌ h₄⁻¹(0) = ∂E × S¹.
%✅ h₄⁻¹(0) = ∅.
%❌ h₄⁻¹(2) = ∅.
%✅ {(0, 0)} × S¹ ⊆ h₄⁻¹(2).
%✅ The set {(x,y,θ) | h₄(x,y,θ) = 2} consists exactly of states for which the line through (x,y) in direction θ passes through the origin.
%❌ h₄ is independent of θ (i.e., changing θ does not change the value).
%❌ For every (x,y) ∈ E and every θ ∈ S¹, we have h₄(x,y,θ) ≥ 1.
%❌ The set {(x,y,θ) | h₄(x,y,θ) > 0} equals int(E) × S¹.
%✅ h₄⁻¹(1) can be expressed as the union over θ ∈ S¹ of line segments, each of which is located in a plane parallel to the (x,y)-plane.
%❌ h₁ ∼ h₂.
%❌ h₂ ∼ h₃.
%❌ h₃ ∼ h₄.
%❌ h₁ ⪰ h₂.
%❌ h₂ ⪰ h₁.
%❌ h₂ ⪰ h₃.
%❌ h₃ ⪰ h₂.
%❌ h₄ ⪰ h₂.
%❌ h₄ ⪰ h₃.
%❌ h₂ ⪰ h₄.
%✅ Sensors h₁, h₂, h₃, h₄ are incomparable.
%❌ h₂₃ ⪰ h₁.
%❌ h₁ ⪰ h₂₃.
%✅ h₂₃ ⪰ h₂.
%❌ h₂ ⪰ h₂₃.
%✅ h₂₃ ⪰ h₃.
%❌ h₃ ⪰ h₂₃.
%✅ h₂₃ ⪰ h₄.
%❌ h₄ ⪰ h₂₃.
%❌ h₅⁻¹(0) = ∂E × S¹.
%❌ h₅⁻¹(0) = E × S¹.
%✅ {(0, 0)} × S¹ ⊆ h₅⁻¹(1).
%✅ h₅⁻¹(1) can be expressed as the union over θ ∈ S¹ of planar circular arcs.
%✅ h₅⁻¹(1/2) can be expressed as the union over θ ∈ S¹ of planar circular arcs.
%❌ The range of h₅ is exactly [0, 1].
%❌ h₅ is independent of θ.
%❌ The set {(x,y,θ) | h₅(x,y,θ) > 0} equals int(E) × S¹.
%✅ h₅⁻¹(1) = h₂⁻¹(1) ∩ h₃⁻¹(1) for α = 90°.
%✅ h₅⁻¹(1) = h₂⁻¹(1) for α = 0°.
%✅ h₅ ∼ h₂, for α = 0°.
%❌ h₅ ∼ h₃, for α = 0°.
%✅ h₅ ⪰ h₂, for α = 0°.
%✅ h₂ ⪰ h₅, for α = 0°.
%❌ h₅ ∼ h₂, for α = 90°.
%❌ h₅ ∼ h₃, for α = 90°.
%❌ h₅ ⪰ h₂, for α = 90°.
%❌ h₂ ⪰ h₅, for α = 90°.
%❌ h₅ ∼ h₄, for α = 90°.
%❌ h₄ ⪰ h₅, for α = 90°.
%✅ The sensor output h is independent of θ.
%✅ For any environment E ∈ {E₁, E₂}, h(p,θ,E) = 0 if and only if p ∈ ∂E.
%❌ h⁻¹(0) contains all states with p ∈ ∂E₁ ∪ ∂E₂.
%✅ h⁻¹(0) = (∂E₁ × S¹ × {E₁}) ∪ (∂E₂ × S¹ × {E₂}).
%✅ If h(p,θ,E) = 1, then necessarily p = (0, 0).
%❌ If h(p,θ,E) = 1, then necessarily E = E₂.
%✅ There exist two different states x₁ = (p,θ,E₁) and x₂ = (p,θ,E₂) with the same p and θ such that h(x₁) ≠ h(x₂).
%❌ For every p ∈ E₁ ∩ E₂, dist(p, ∂E₁) = dist(p, ∂E₂).
%✅ There exists p ∈ E₁ ∪ E₂ such that dist(p, ∂E₁) = dist(p, ∂E₂).
%✅ For any state with E = E₁, h(p,θ,E₁) ≤ 1. And for any state with E = E₂, h(p,θ,E₂) ≤ 1.
Multiple robots¶
Consider 10 point-sized mobile robots moving in a planar square environment
E = [-1,1]^2 \subset \mathbb{R}^2.
Each robot has no heading and no internal degrees of freedom. The state of robot i is given only by its position
p_i = (x_i,y_i) \in E, \quad \forall i \in \{1,\dots,10\}.
The state space of the system is denoted by X.
A virtual sensor is modeled as a mapping:
h : X \to Y,
where Y is the observation space.
Let the detection region be a fixed set V ⊆ E. Define the following three sensor mappings:
- Sensor h₆:
h_6(x) =
\begin{cases}
1, & \text{if } \exists\, i \in \{1,\dots,10\} \text{ such that } p_i \in V,\\
0, & \text{otherwise.}
\end{cases}
- Sensor h₇:
h_7(x) =
\begin{cases}
1, & \text{if } \forall\, i \in \{1,\dots,10\},\; p_i \in V,\\
0, & \text{otherwise.}
\end{cases}
- Sensor h₈:
h_8(x) = \bigl| \{\, i \in \{1,\dots,10\} \mid p_i \in V \,\} \bigr|.
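A direct implementation of these three mappings is short. The following is a sketch only; the rectangular detection region V below is an arbitrary example of my own choosing, since the assignment leaves V as a fixed but unspecified subset of E.

```python
def in_V(p):
    """Membership test for an example detection region V = [0, 1] × [0, 1] ⊆ E."""
    x, y = p
    return 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0

def h6(state):
    """h6: 1 if at least one robot lies in V, else 0."""
    return 1 if any(in_V(p) for p in state) else 0

def h7(state):
    """h7: 1 if all robots lie in V, else 0."""
    return 1 if all(in_V(p) for p in state) else 0

def h8(state):
    """h8: the number of robots whose positions lie in V."""
    return sum(1 for p in state if in_V(p))

# Example state: ten robots, exactly three of them inside V.
state = [(0.5, 0.5), (0.2, 0.9), (0.1, 0.1)] + [(-0.5, -0.5)] * 7
# h6(state) == 1, h7(state) == 0, h8(state) == 3
```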
%❌ Y = ℝ².
%❌ Y = ℝ¹⁰.
%❌ Y = ℝ³⁰.
%✅ X ⊂ ℝ²⁰.
%✅ X = E¹⁰ = E × E × ⋯ × E (10 times).
%❌ Each state in X is a single point in the plane.
%✅ Each state in X is a 10-tuple (p₁,…,p₁₀) with pᵢ ∈ E.
%❌ The state space X depends on the detection region V.
%✅ V ⊆ E.
%❌ V = E.
%❌ V ⊂ ℝ²⁰.
%✅ A robot is in the detection region if and only if its position lies in V.
%✅ If a state x ∈ V¹⁰, then h₆(x) = h₇(x) = h₈(x) = 1.
%✅ If a state x ∈ V × E⁹, then at least one robot is guaranteed to be in V.
%✅ h₆ detects whether at least one robot lies in V.
%❌ h₆ detects whether all robots lie in V.
%❌ h₆ counts the number of robots that lie in V.
%✅ h₈(x) = 1 if and only if exactly one robot lies in V.
%✅ If a state x ∈ V × E⁹, then h₆(x) = 1.
%✅ If a state x ∈ V¹⁰, then h₆(x) = 1.
%✅ h₇ detects whether all robots lie in V.
%❌ h₇ detects whether at least one robot lies in V.
%❌ h₇ counts the number of robots that lie in V.
%❌ If a state x ∈ V × E⁹, then h₇(x) = 1.
%✅ If a state x ∈ V¹⁰, then h₇(x) = 1.
%❌ h₇(x) = 1 if and only if exactly one robot lies in V.
%❌ h₈(x) ∈ {0,1} for all states.
%❌ If a state x ∈ V × E⁹, then h₈(x) = 1.
%✅ If a state x ∈ V¹⁰, then h₈(x) = 10.
%❌ h₆ ∼ h₇.
%❌ h₆ ∼ h₈.
%❌ h₆ ⪰ h₇.
%❌ h₇ ⪰ h₆.
%✅ h₈ ⪰ h₆.
%✅ h₈ ⪰ h₇.
Authors¶
Anna LaValle.