<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Echospace @ UW</title><link>https://uw-echospace.github.io/</link><atom:link href="https://uw-echospace.github.io/index.xml" rel="self" type="application/rss+xml"/><description>Echospace @ UW</description><generator>Source Themes Academic (https://sourcethemes.com/academic/)</generator><language>en-us</language><lastBuildDate>Tue, 09 Sep 2025 00:00:00 +0000</lastBuildDate><image><url>https://uw-echospace.github.io/images/icon_hu9a3e81eca2a83564e549c2cdc32b0637_27049_512x512_fill_lanczos_center_2.png</url><title>Echospace @ UW</title><link>https://uw-echospace.github.io/</link></image><item><title>Code of Conduct & What we value</title><link>https://uw-echospace.github.io/group/coc/</link><pubDate>Sun, 29 Sep 2024 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/group/coc/</guid><description><h3 id="why">Why</h3>
<ul>
<li>We respect and want to take advantage of the diversity of background and expertise in the group.</li>
<li>We also want to create an optimal learning and research environment for everyone.</li>
</ul>
<h3 id="code-of-conduct">Code of Conduct</h3>
<p>Echospace is dedicated to providing a harassment-free experience for everyone, regardless of gender, gender identity and expression, age, sexual orientation, disability, physical appearance, body size, race, or religion (or lack thereof). We do not tolerate harassment of group members in any form. Sexual language and imagery are not appropriate for any group venue, including meetings, presentations, or discussions, and in both physical and online spaces.</p>
<h3 id="what-we-value">What we value</h3>
<ul>
<li>Be proactive
<ul>
<li>Say hello to people you meet at work (e.g., in the morning when you arrive)</li>
<li>Be on time and respect others’ time; communicate if you don’t have time</li>
<li>Don’t be afraid of being wrong; ask questions, and answer questions from others kindly</li>
</ul>
</li>
<li>Be open to change</li>
<li>Communication
<ul>
<li>Respect that another person’s background may be different from yours</li>
<li>Be welcoming to others’ perspectives and what makes them tick, what excites them</li>
<li>Don’t assume people know or don’t know what you’re talking about; it is OK and often better to just ask</li>
<li>When there are different opinions, try to convince others by reasoning, and avoid being dismissive because of others’ career stages or rank</li>
<li>Strive to communicate, both when we don’t think we understand what the other person is saying and when we think we are not being understood</li>
<li>Acknowledge contributions from others</li>
</ul>
</li>
<li>Ask questions and provide feedback kindly
<ul>
<li>Use “we” instead of “you”</li>
<li>Use the sandwich method: start with a positive statement, follow with a comment such as “have you considered XYZ?”, and end with a positive affirmation or an offer of support</li>
<li>Provide constructive feedback, elaborate on your comments, and be willing to explain more</li>
<li>Value listening to others, and asking respectful questions</li>
<li><strong>Provide positive feedback too!</strong></li>
</ul>
</li>
<li>Group interaction and collaboration
<ul>
<li>Have clear expectations and communicate openly, to avoid last minute surprises</li>
<li>Ensure everyone has the opportunity to participate both online and in person</li>
<li>Give opportunities to others to speak first</li>
<li>Make accommodations for personal emergencies</li>
</ul>
</li>
<li>Help each other grow
<ul>
<li>Provide opportunities for peer mentoring</li>
<li>Share opportunities in the group that others might be interested in (or would want to be involved in)</li>
<li>Enable and encourage growth by everyone in the group, in directions that align with what they’re interested in</li>
<li>Be comfortable bringing up issues or obstacles that hold you back from getting work done</li>
</ul>
</li>
</ul>
<h2 id="feedback-sharing">Feedback Sharing</h2>
<p>Provide feedback on what to improve either in person or through the
<a href="https://forms.gle/NZFZBQLk9BoMLZCV6" target="_blank" rel="noopener">Echospace Anonymous Feedback Form</a> if you prefer to remain anonymous. The form requires you to be logged in with your <code>uw</code> email address, but we will not have access to it.</p>
<h2 id="conflict-resolution">Conflict Resolution</h2>
<p>If you see something inappropriate, let Wu-Jung or Valentina know immediately, or contact the
<a href="https://www.washington.edu/ombud/" target="_blank" rel="noopener">Office of the Ombud</a> for support in conflict resolution. You can refer to some additional resources from the UW HR office
<a href="https://hr.uw.edu/policies/complaint-resolution/" target="_blank" rel="noopener">here</a> or refer to your department for advice.</p></description></item><item><title>BOAT: Bridge to Ocean Acoustics and Technology</title><link>https://uw-echospace.github.io/project/boat/</link><pubDate>Thu, 28 Mar 2024 17:02:36 -0800</pubDate><guid>https://uw-echospace.github.io/project/boat/</guid><description><p>Ocean acoustics is an interdisciplinary field in which researchers focus on measuring, modeling, and understanding acoustic phenomena of oceanographical, geological, biological, and anthropogenic sources. However, despite its inherent interdisciplinary nature, Ocean acoustics research currently has limited presence in most US institutions and is typically viewed as a highly niched field.</p>
<p>The
<a href="https://boat-ocean-acoustics.github.io/" target="_blank" rel="noopener">Bridge to Ocean Acoustics and Technology (BOAT)</a> program aims to broaden access to ocean acoustics by empowering learners to explore and advance the field through collaboration and shared knowledge, focusing on:</p>
<ul>
<li>Developing open, executable, and web-hosted tutorials that encapsulate fundamental ocean acoustics knowledge and techniques as living documents.</li>
<li>Growing the ocean acoustics education and research community through interactive and collaborative workshops.</li>
</ul>
<p>In the current pilot phase, we will host two education workshops to provide a hands-on introduction to ocean acoustics—from fundamental concepts to real-world applications—using interactive
<a href="https://jupyter.org/" target="_blank" rel="noopener">Jupyter notebooks</a>. We will cover topics broadly applicable to both active and passive acoustics, with hands-on experience using echosounder and hydrophone datasets.</p>
<p>Our goal is to create open tutorials that can 1) be adapted to various educational settings, including regular university courses or summer workshops, and 2) serve as blueprints for further developing in-depth resources on specific ocean acoustics topics.</p>
<p>See the
<a href="https://boat-ocean-acoustics.github.io/" target="_blank" rel="noopener">BOAT website</a> for information for the two upcoming workshops in
<a href="https://boat-ocean-acoustics.github.io/workshop_seattle.html" target="_blank" rel="noopener">Seattle</a> and
<a href="https://boat-ocean-acoustics.github.io/workshop_new_orleans.html" target="_blank" rel="noopener">New Orleans</a>!</p>
<p><strong>Funding Agency:</strong> Office of Naval Research, Ocean Acoustics Program</p></description></item><item><title>Analyzing the effects of environmental conditions on bat activity</title><link>https://uw-echospace.github.io/project_others/2024-ubna-weather/</link><pubDate>Fri, 15 Mar 2024 17:02:36 -0800</pubDate><guid>https://uw-echospace.github.io/project_others/2024-ubna-weather/</guid><description><p>As global warming accelerates and continues to reshape regional climates, changes in local weather patterns, such as rising temperatures, irregular rainfall, and sporadic forest fires, can profoundly alter bat behavior, foraging patterns, and migration timing, affecting the overall health of entire bat communities. Therefore, understanding which environmental conditions have an impact on bat activity, and to what extent, is critical to developing effective conservation strategies for a region.</p>
<p>With the Union Bay Natural Area at the University of Washington as a research site, using bat echolocation call detections from the
<a href="https://uw-echospace.github.io/project/2022-ubna-pam/" target="_blank" rel="noopener">UBNA passive acoustic monitoring program</a>, meteorological data from the University of Washington weather station, and lunar phase data from NASA, the project aims to find the environmental factors that influence bat activity and explore the effects of extreme stormy weather on local bat emergence through statistical testing and inference.</p>
<p>Funding:
<a href="https://www.washington.edu/research/or/royalty-research-fund-rrf/" target="_blank" rel="noopener">UW Royalty Research Fund</a></p></description></item><item><title>UBNA passive acoustic monitoring project</title><link>https://uw-echospace.github.io/project_others/2022-ubna-pam/</link><pubDate>Wed, 01 Sep 2021 17:02:36 -0800</pubDate><guid>https://uw-echospace.github.io/project_others/2022-ubna-pam/</guid><description><p>Understanding how echolocators make use of sound to navigate their surroundings has great potential for informing the design of advanced acoustic sensing technologies. Although much has already been studied in laboratory experiments, technology has just started catching up to allow researchers to study echolocation-related processes in the wild. Advancements in passive acoustic monitoring tools have made it affordable to conduct long-term acoustic surveys on animals in the wild. Echolocators, like bats, are well-suited for monitoring using simple passive acoustic techniques because of their use of acoustics and navgitaion in air.</p>
<p>In this project, we sought to collect long-term acoustic data using AudioMoth recorders in an urban natural area, the Union Bay Natural Area at the University of Washington. With these data, we hope to answer questions about how environmental conditions influence how bats choose to forage.</p>
<p><strong>Funding agency</strong>:
<a href="https://www.washington.edu/research/or/royalty-research-fund-rrf/" target="_blank" rel="noopener">University of Washington Royalty Research Fund</a></p></description></item><item><title>Passive acoustic monitoring in the Union Bay Natural Area</title><link>https://uw-echospace.github.io/project/ubna-pam/</link><pubDate>Wed, 01 Sep 2021 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/project/ubna-pam/</guid><description><p>Passive acoustic monitoring (PAM) has become a useful technique for monitoring soniferous animals in both terrestrial and marine habitats, and has in recent years been particularly bolstered by the broader availability and accessibility of low-cost recording devices.</p>
<p>In this project, we deploy low-cost
<a href="https://www.openacousticdevices.info/audiomoth" target="_blank" rel="noopener">AudioMoth</a> recorders at multiple sites in the
<a href="https://botanicgardens.uw.edu/center-for-urban-horticulture/visit/union-bay-natural-area/" target="_blank" rel="noopener">Union Bay Natural Area</a>, right on the eastern edge of the University of Washington, Seattle campus. Since the project began in fall 2021, we have collected over 30 TB of recordings that are embedded with sounds from a wide variety of animals (e.g., birds, bats, frogs) and anthopogenic sources (e.g., airplanes, football stadium roars).</p>
<p>Beyond generating a rich dataset, the project fieldwork and data analysis provide an accessible entry point for students to engage in real-world bioacoustics research, with hands-on data science and instrumentation opportunities.</p>
<p>We have leveraged this dataset to investigate the
<a href="https://uw-echospace.github.io/talk/202405-aditya-duty-cycle/">impact of duty cycle recording on bat monitoring</a> and support multiple capstone projects in the
<a href="https://www.washington.edu/datasciencemasters/" target="_blank" rel="noopener">UW Data Sciene Master&rsquo;s Program</a>, focused on developing open-source, cloud-hosted workflows for analyzing large PAM datasets. With these tools in place, going forward we plan to characterize the seasonal soundscape fluctuations in UBNA with respect to weather/climate events, and find opportunities to expand this effort to a community monitoring program in the Greater Seattle area.</p>
<p>
<img src="featured.png" alt="UBNA bat activity grid" style="width:600px"/>
<b>Bat call activity detected in two UBNA sites in 2022.</b>
</p>
<p><strong>Funding</strong>:
<a href="https://www.washington.edu/research/or/royalty-research-fund-rrf/" target="_blank" rel="noopener">UW Royalty Research Fund</a></p></description></item><item><title>Machine learning in fisheries acoustics</title><link>https://uw-echospace.github.io/project/hake-ml/</link><pubDate>Sat, 01 May 2021 17:02:36 -0800</pubDate><guid>https://uw-echospace.github.io/project/hake-ml/</guid><description><p>
<a href="https://storymaps.arcgis.com/stories/e245977def474bdba60952f30576908f" target="_blank" rel="noopener">Active acoustic data collected by echosounders</a> (high-frequency sonar systems) play a crucial role in marine ecological research and fisheries stock assessments. Recent technical advancements has further integrated echosounders onto many ocean observing platforms, leading to the rapid accumulation of echosounder data worldwide.</p>
<p>In this project, we tackle the challenge of translating experiences from human experts into machine learning models capable of efficiently extracting biological information from large echosounder datasets. Using the rich dataset collected by the
<a href="https://www.fisheries.noaa.gov/west-coast/science-data/joint-us-canada-integrated-ecosystem-and-pacific-hake-acoustic-trawl-survey" target="_blank" rel="noopener">Joint U.S.-Canada Integrated Ecosystem and Pacific Hake Acoustic Trawl Survey</a> dated back to early 2000s, we are developing deep learning models to automatically annotate echograms—color-coded visual representations of echo returns—with the presence of specific fish and zooplankton species or taxa.</p>
<p>In the first stage of the project, we are focusing on developing
<a href="https://uw-echospace.github.io/talk/202405-asa-ottawa-hake/">an echogram segmentation model to identify Pacific hake</a>, a keystone species and the largest fishery stock on the west coast of the US. Identifying hake on echograms is more challenging compared to many other fish species, due to their polymorphic appearance and diffused school boundaries. We found that neural networks' large learning capacity are well-suited to address these complexities. However, as in many other domains, organizing echosounder data with survey metadata and sorting expert annotations remains a significant bottleneck in fully leveraging these technologies.</p>
<p>Moving forward, we aim to expand the model to include other ecologically and commercially important fish species in the California Current ecosystem, and incorporate other analytical methods, such as Bayesian inversion techniques, to improve acoustic data interpretation and biomass estimation accuracy.</p>
<!-- To take full advantage of these large and complex new datasets, in this project we aim to combine the development of machine learning methodology with a cloud-based workflow to accelerate the extraction of biological information from fisheries acoustic data. Our group has developed and used [Echopype](https://echopype.readthedocs.io/en/stable/), a Raw Sonar Backscatter data parsing Python package, and [Echoregions](https://echoregions.readthedocs.io/en/latest/), an Echoview annotation data parsing Python package. Transferring data from Echoview and proprietary echosounder formats to Python data products enables seamless integration with a rich ecosystem of scientific computing tools developed by a vast community of open-source contributors, thus allowing us to use our data to train deep learning models to predict regions of interest in echograms.
<img src="featured.png" alt="Fisheries Acoustics"> -->
<p>
<img src="featured.png" alt="Echogram examples showing the deep learning model predicts regions similar to human expert annotations." style="width:1000px"/>
<b>Echogram examples showing the deep learning model predicts regions similar to human expert annotations.</b>
</p>
<p>This project is in close collaboration with the
<a href="https://www.fisheries.noaa.gov/west-coast/sustainable-fisheries/fisheries-engineering-and-acoustic-technologies-team" target="_blank" rel="noopener">Fisheries Engineering and Acoustics Technology (FEAT) team</a> at the NOAA Fisheries
<a href="https://www.fisheries.noaa.gov/about/northwest-fisheries-science-center" target="_blank" rel="noopener">Northwest Fisheries science center (NWFSC)</a>.</p>
<p><strong>Funding agency</strong>: NOAA Fisheries</p></description></item><item><title>Scalable, cloud-native processing of water column sonar data</title><link>https://uw-echospace.github.io/project_others/2021-cloud-workflows/</link><pubDate>Sun, 24 Oct 2021 16:56:36 -0800</pubDate><guid>https://uw-echospace.github.io/project_others/2021-cloud-workflows/</guid><description><p>Scientists commonly use active sonar systems to collect data about mid-trophic level animals like zooplankton and small fish, which play an important role in marine ecosystems. Echosounders, or fish-finders, are high-frequency sonar systems that emit pulses of sound and record the reflections from animals, the seabed, and other objects. These instruments have proven to be more efficient and effective for collecting data over a large survey area or a long time period than many other sampling methods, such as underwater imaging and net trawls. This technology has been widely adopted by the ocean science and commercial fishing communities and more recently has been integrated with autonomous vehicles, resulting in a massive amount of data. However, these datasets can be difficult to analyze and are often underutilized. We will address this issue by adopting and advancing data standards, developing a streamlined data processing workflow, and integrating open-source software tools that capitalize on recent advancements in cloud computing technologies to efficiently transform large quantities of ocean sonar data into information that is useful for exploring, monitoring, and managing living marine resources.</p>
<p><strong>Funding agency</strong>: NOAA Office of Ocean Exploration and Research
<a href="https://oceanexplorer.noaa.gov/news/oer-updates/2021/fy21-ffo-schedule.html" target="_blank" rel="noopener">FY2021 grants</a></p></description></item><item><title>The open-source "Echostack" for flexible and scalable echosounder data processing</title><link>https://uw-echospace.github.io/project/echostack/</link><pubDate>Thu, 01 Jul 2021 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/project/echostack/</guid><description><p>
<a href="https://storymaps.arcgis.com/stories/e245977def474bdba60952f30576908f" target="_blank" rel="noopener">Water column sonar data collected by echosounders</a> are essential for fisheries and marine ecosystem research, enabling the detection, classification, and quantification of fish and zooplankton from many different ocean observing platforms. However, the broad usage of these data has been hindered by the lack of modular software tools that allow flexible composition of data processing workflows that incorporate powerful analytical tools in the scientific Python ecosystem.</p>
<p>We address this gap by developing <strong>Echostack</strong>, a suite of open-source Python software packages that leverage existing distributed computing and cloud-interfacing libraries to support intuitive and scalable data access, processing, and interpretation. These tools can be used individually or orchestrated together, which we demonstrate in example use cases for a fisheries acoustic-trawl survey.</p>
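<p>As a flavor of how the pieces compose, the sketch below chains a few Echopype steps (raw-file parsing, calibration, and binned averaging). The file name is a placeholder, and exact function arguments may differ between Echopype versions:</p>
<pre><code class="language-python"># Minimal sketch of chaining Echostack steps with Echopype;
# argument names may vary across Echopype versions.
import echopype as ep

# Parse a raw echosounder file into a standardized EchoData object
ed = ep.open_raw("survey_file.raw", sonar_model="EK60")
ed.to_zarr("survey_file.zarr")  # save to a cloud-friendly format

# Calibrate to volume backscattering strength (Sv)
ds_Sv = ep.calibrate.compute_Sv(ed)

# Reduce to a common grid: mean Sv over depth and ping-time bins
ds_MVBS = ep.commongrid.compute_MVBS(ds_Sv)
</code></pre>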
<p>Below is a summary of the Echostack packages:</p>
<p>
<img src="echostack_summary.png" alt="Echostack summary" style="width:1000px"/>
</p>
<p>For more information, check out the code repositories below:</p>
<ul>
<li>
<a href="https://github.com/OSOceanAcoustics/echopype" target="_blank" rel="noopener">Echopype</a>
<ul>
<li>Check out our
<a href="https://doi.org/10.1093/icesjms/fsae133" target="_blank" rel="noopener">Echopype paper</a> in the ICES Journal of Marine Science</li>
</ul>
</li>
<li>
<a href="https://github.com/OSOceanAcoustics/echopop" target="_blank" rel="noopener">Echopop</a>
<ul>
<li>Learn more on the
<a href="../../project_others/echopop/">Echopop project page</a></li>
</ul>
</li>
<li>
<a href="https://github.com/OSOceanAcoustics/echoshader" target="_blank" rel="noopener">Echoshader</a></li>
<li>
<a href="https://github.com/OSOceanAcoustics/echoregions" target="_blank" rel="noopener">Echoregions</a></li>
<li>
<a href="https://github.com/OSOceanAcoustics/echodataflow" target="_blank" rel="noopener">Echodataflow</a></li>
</ul>
<p>These packages are accompanied by a set of
<a href="https://github.com/OSOceanAcoustics/echolevels" target="_blank" rel="noopener">Echolevels</a> that categorize data products at different workflow stages to enhance data understanding and provenance tracking.</p>
<p>Check out Wu-Jung&rsquo;s talk at SciPy 2024 and the associated
<a href="https://doi.org/10.25080/WXRH8633" target="_blank" rel="noopener">paper</a> in the proceedings!
<div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
<iframe src="https://www.youtube.com/embed/YRFxMGisGww" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allowfullscreen title="YouTube Video"></iframe>
</div>
</p>
<p><strong>Funding</strong>:</p>
<ul>
<li>NOAA Fisheries</li>
<li>NOAA Office of Ocean Exploration and Research
<a href="https://oceanexplorer.noaa.gov/news/oer-updates/2021/fy21-ffo-schedule.html" target="_blank" rel="noopener">FY2021 grants</a></li>
</ul></description></item><item><title>Echopop: biomass estimation for Pacific Hake</title><link>https://uw-echospace.github.io/project_others/echopop/</link><pubDate>Tue, 25 Jan 2022 17:02:36 -0800</pubDate><guid>https://uw-echospace.github.io/project_others/echopop/</guid><description><p>Echopop is a software package for processing backscatter measurements and biological data collected from acoustic-trawl surveys to produce population estimates and other metrics. The development of this software has been primarily focused on surveys targeting Pacific hake (<strong>see below for more details</strong>), but the goal is to generalize the software in the future for broader fisheries community use.</p>
<p>The
<a href="https://www.fisheries.noaa.gov/west-coast/sustainable-fisheries/fisheries-engineering-and-acoustic-technologies-team" target="_blank" rel="noopener">Fisheries Engineering and Acoustics Technology (FEAT) team</a> at the NOAA Fisheries
<a href="https://www.fisheries.noaa.gov/about/northwest-fisheries-science-center" target="_blank" rel="noopener">Northwest Fisheries Science Center (NWFSC)</a> collaborates with
<a href="https://www.dfo-mpo.gc.ca/index-eng.htm" target="_blank" rel="noopener">Fisheries and Oceans Canada (DFO) - Pacific Region</a> to estimate total biomass of
<a href="https://www.fisheries.noaa.gov/species/pacific-whiting" target="_blank" rel="noopener">Pacific hake (<i>Merluccius productus</i>)</a> by incorporating acoustic and biological trawl data from the
<a href="https://www.fisheries.noaa.gov/west-coast/science-data/joint-us-canada-integrated-ecosystem-and-pacific-hake-acoustic-trawl-survey" target="_blank" rel="noopener">Joint U.S.-Canada Integrated Ecosystem and Pacific Hake Acoustic-Trawl Survey</a> (aka the &ldquo;Hake survey&rdquo;).</p>
<p>These biomass estimates are the inputs for the stock assessment of hake and need to be completed in an efficient and timely manner after the survey. The biomass estimates are currently produced by a suite of Matlab scripts operated by a single user, and the analysis procedures are not easily adaptable by other FEAT/DFO team members. The central objective of this project is to provide a well-documented open-source
<a href="https://github.com/OSOceanAcoustics/echopop" target="_blank" rel="noopener">Python software package</a> (<code>echopop</code>) that contains the core computational functionality of the current Matlab EchoPro program and provides basic visualization of the analysis results.</p>
<p>The new software package (currently
<a href="https://github.com/OSOceanAcoustics/echopop/releases/latest" target="_blank" rel="noopener">version 0.4.0</a> and available on
<a href="https://pypi.org/project/echopop/" target="_blank" rel="noopener">PyPi</a>) contains an expanded
<a href="https://echopop.readthedocs.io/en/latest/" target="_blank" rel="noopener">documentation</a> that details the underlying theory and algorithmic implementation that help facilitate reproducibility. Other features include an
<a href="https://echopop.readthedocs.io/en/latest/api.html" target="_blank" rel="noopener">Application Programming Interface (API)</a> that can be invoked in a reproducible manner, a flexible analysis configuration that allows for both machine and human-readable parameterizations, and interactive Jupyter notebooks that exemplify various workflows
<a href="https://echopop.readthedocs.io/en/latest/example_notebooks/example_echopop_workflow.html" target="_blank" rel="noopener">ranging from initial data processing to kriging</a>.</p>
<p><strong>Funding agency</strong>: NOAA Fisheries, NOAA NWFSC</p></description></item><item><title>Opportunities at Echospace</title><link>https://uw-echospace.github.io/position/opportunities/</link><pubDate>Sun, 01 Sep 2024 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/position/opportunities/</guid><description><h2 id="general-information">General information</h2>
<p>We welcome researchers and students with diverse backgrounds to come work with us in Echospace! If you don&rsquo;t see a specific position below, feel free to reach out to us directly. Please include a cover letter and a CV/resume when initiating contact, so that we have a better idea of your background and what you are looking for.</p>
<h2 id="fellowship-opportunities">Fellowship opportunities</h2>
<p>Below is a list of fellowship opportunities within and outside of UW. Relevant areas include but are not limited to fisheries and ocean acoustics, animal echolocation (bats and dolphins), marine ecology, and environmental data science. Feel free to reach out to us for questions and discussion.</p>
<h3 id="postdoc-fellowships">Postdoc fellowships</h3>
<h4 id="university-of-washington">University of Washington</h4>
<ul>
<li>
<a href="https://ap.washington.edu/ahr/position-details/?job_id=99111" target="_blank" rel="noopener">APL SEED Postdoctoral Scholar Program</a></li>
<li>
<a href="https://cicoes.uw.edu/education/postdoc-program/" target="_blank" rel="noopener">Cooperative Institute for Climate, Ocean, and Ecosystem Studies (CICOES) postdoctoral fellowship</a></li>
<li>
<a href="https://www.wrfseattle.org/grants/wrf-postdoctoral-fellowships/" target="_blank" rel="noopener">Washington Research Foundation (WRF) Postdoctoral Fellowship</a></li>
</ul>
<h4 id="external">External</h4>
<ul>
<li>
<a href="https://beta.nsf.gov/funding/opportunities/ocean-sciences-postdoctoral-research-fellowships-oce-prf-0" target="_blank" rel="noopener">NSF Ocean Sciences (OCE) Postdoctoral Research Fellowships (OCE-PRF)</a></li>
<li>
<a href="https://beta.nsf.gov/funding/opportunities/ear-postdoctoral-fellowships-ear-pf" target="_blank" rel="noopener">NSF Division of Earth Sciences (EAR) Postdoctoral Fellowships (EAR-PF)</a></li>
<li>
<a href="https://acousticalsociety.org/fellowships-and-scholarships/" target="_blank" rel="noopener">Acoustical Society of America (ASA) F. V. Hunt Postdoctoral Research fellowship</a></li>
</ul>
<h3 id="graduate-fellowships">Graduate fellowships</h3>
<ul>
<li>
<a href="https://www.nsfgrfp.org/" target="_blank" rel="noopener">NSF Graduate Research Fellowships Program (GRFP)</a></li>
</ul>
<h3 id="undergraduate-fellowships">Undergraduate fellowships</h3>
<ul>
<li>
<a href="https://expd.uw.edu/mge/apply/research/" target="_blank" rel="noopener">UW Mary Gates Research Scholarship</a></li>
<li>
<a href="https://www.apl.uw.edu/education/dino_sip.php" target="_blank" rel="noopener">APL DINO-SIP Diverse + Inclusive Naval Oceanographic Summer Intership Program</a></li>
</ul></description></item><item><title>ADCP-equipped underwater glider as a distributed biological sensing tool</title><link>https://uw-echospace.github.io/project/2020-glider-adcp/</link><pubDate>Tue, 01 Sep 2020 17:02:36 -0800</pubDate><guid>https://uw-echospace.github.io/project/2020-glider-adcp/</guid><description><p>Mid-trophic level animals, such as zooplankton and fish, are keystone organisms in the marine ecosystem and play a critical role in the economy and our food supply chain. However, our understanding of these animals, particularly those in the pelagic zones, is severely limited. This gap in knowledge has greatly impeded our ability to make informed policy decisions to support sustainable resource management. The root cause of this problem is the lack of tools that can collect information about these animals at large temporal and spatial scales comparable to other physical, chemical, and lower-trophic biological (e.g., chlorophyll) oceanographic variables.</p>
<p>Gliders have provided unparalleled mobile, persistent access to deep, remote ocean environments at a fraction of the cost of a research vessel. Taking advantage of this unique capability, in this project we aim to develop sampling strategies and data analysis methodologies to enable distributed long-term observation of mid-trophic marine organisms using
<a href="https://apl.uw.edu/project/project.php?id=seaglider_auv" target="_blank" rel="noopener">Seagliders</a> equipped with acoustic Doppler current profilers (ADCPs).</p>
<p><strong>Funding agency</strong>: NOAA Office of Ocean Exploration and Research
<a href="https://oceanexplorer.noaa.gov/news/oer-updates/2020/fy20-ffo-schedule.html" target="_blank" rel="noopener">FY2020 grants</a></p></description></item><item><title>Computing startup resources</title><link>https://uw-echospace.github.io/group/compute_docs/</link><pubDate>Thu, 07 Dec 2023 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/group/compute_docs/</guid><description><p>In this collection we share useful starting computing resources among echospace group members.</p>
<ul>
<li>
<a href="https://echospace-group-docs.readthedocs.io/en/latest/compute-conda-jupyter.html" target="_blank" rel="noopener">Conda and Jupyter</a></li>
<li>
<a href="https://echospace-group-docs.readthedocs.io/en/latest/compute-git.html" target="_blank" rel="noopener">Git and GitHub</a></li>
<li>
<a href="https://echospace-group-docs.readthedocs.io/en/latest/compute-cloud.html" target="_blank" rel="noopener">Cloud computing</a></li>
<li>
<a href="https://echospace-group-docs.readthedocs.io/en/latest/compute-hpc.html" target="_blank" rel="noopener">HPC and SLURM</a></li>
<li>
<a href="https://echospace-group-docs.readthedocs.io/en/latest/compute-osn.html" target="_blank" rel="noopener">OSN data access</a></li>
<li>
<a href="https://echospace-group-docs.readthedocs.io/en/latest/compute-others.html" target="_blank" rel="noopener">Other topics to be covered</a></li>
</ul></description></item><item><title>Echopype</title><link>https://uw-echospace.github.io/project_others/echopype/</link><pubDate>Sun, 01 Nov 2020 14:50:00 -0800</pubDate><guid>https://uw-echospace.github.io/project_others/echopype/</guid><description><p><strong><code>Echopype</code></strong> is an open-source Python package aimed at
<a href="https://echopype.readthedocs.io/en/latest/why.html" target="_blank" rel="noopener">enhancing the interoperability and scalability</a>
in processing ocean sonar data for biological information.</p>
<p>I started building this package in early 2018 when I couldn&rsquo;t find an affordable tool that
allowed easy access and manipulation of echosounder data collected by
different sonar models.</p>
<p>For the latest updates, check out our repo at: <a href="https://github.com/OSOceanAcoustics/echopype">https://github.com/OSOceanAcoustics/echopype</a>.</p>
<p>Check out my talk at SciPy 2019 that discussed the goals and philosophy of echopype:
<div style="position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;">
<iframe src="https://www.youtube.com/embed/qboH7MyHrpU" style="position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;" allowfullscreen title="YouTube Video"></iframe>
</div>
</p></description></item><item><title>Pattern discovery from long-term echosounder time series</title><link>https://uw-echospace.github.io/project/2019-ooi-mtx-decomp/</link><pubDate>Tue, 01 Jan 2019 14:50:00 -0800</pubDate><guid>https://uw-echospace.github.io/project/2019-ooi-mtx-decomp/</guid><description><p><strong>Funding agency</strong>: National Science Foundation
<a href="https://www.nsf.gov/awardsearch/showAward?AWD_ID=1849930&amp;HistoricalAwards=false" target="_blank" rel="noopener">Award #1849930</a></p></description></item><item><title>Echo Statistics</title><link>https://uw-echospace.github.io/project/echo-stat-tutorial/</link><pubDate>Sat, 01 Dec 2018 00:00:00 -0800</pubDate><guid>https://uw-echospace.github.io/project/echo-stat-tutorial/</guid><description><p>For a 2018 tutorial I published with
<a href="https://www2.whoi.edu/staff/tstanton/" target="_blank" rel="noopener">Tim Stanton</a> and
<a href="https://www.linkedin.com/in/kyungmin-baik-098156149/" target="_blank" rel="noopener">Kyungmin Baik</a> in
the Journal of the Acoustical Society of America (JASA):</p>
<p><strong>Echo statistics associated with discrete scatterers: A tutorial on physics-based methods</strong>. JASA 144(6): 3124–3171; <a href="https://doi.org/10.1121/1.5052255">https://doi.org/10.1121/1.5052255</a></p>
<p>we provided the Matlab code to reproduce all figures in two forms:</p>
<ul>
<li>a <em>frozen</em> version archived with the paper, and</li>
<li>a GitHub
<a href="https://github.com/leewujung/echo-stats-tutorial" target="_blank" rel="noopener">repository</a> minted with a
<a href="https://doi.org/10.5281/zenodo.2458776" target="_blank" rel="noopener">DOI from Zenodo</a>.</li>
</ul>
<p>This way we can keep the code &ldquo;alive&rdquo; on GitHub while also having a convenient snapshot of the code at the time of the tutorial publication.</p></description></item><item><title>Modeling sound propagation in the head of toothed whales</title><link>https://uw-echospace.github.io/project/echolocation-comsol/</link><pubDate>Fri, 01 Jul 2022 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/project/echolocation-comsol/</guid><description><p>Toothed whales, including species such as porpoises, dolphins, orcas, and sperm whales, possess highly specialized anatomical structures in the head to support their biosonar systems - echolocation - honed through millions of years of evolution. These animals have the remarkable ability to detect and track small targets over long distances and discriminate minute differences between targets using echolocation, with performance often surpassing that of current human-made sonar systems. However, many questions remain about how exactly the unusual anatomical structures in the head of toothed whales are orchestrated to support such performance.</p>
<p>As part of a Multidisciplinary University Research Initiative (MURI) project, we use finite element modeling techniques in combination with volumetric representations derived from computed tomography (CT) scans to predict the head-related transfer functions (HRTFs) of a dolphin head. The HRTF summarizes the influence of the head on sounds propagating to the ears. We use HRTFs as a biologically meaningful proxy to provide a physics-based mechanistic understanding of the sound transduction processes.</p>
<!-- TODO: link ASA 2023 talk -->
<p><strong>Funding</strong>: Office of Naval Research, Multidisciplinary University Research Initiative (MURI) program</p></description></item><item><title>Target search and discrimination by echolocating toothed whales</title><link>https://uw-echospace.github.io/project/echolocation-search/</link><pubDate>Thu, 01 Jan 1970 00:33:38 +0000</pubDate><guid>https://uw-echospace.github.io/project/echolocation-search/</guid><description><p>Echolocating animals effortlessly navigate, hunt, and interact with their environment, despite cluttered and noisy return signals. Blind expert human echolocators prove that this capacity does not depend exclusively on biological specializations unique to particular species. This project is an integrated component of a larger collaborative Multidisciplinary University Research Initiative (MURI) project focused on active sensing in echolocating marine mammals and humans. The MURI team uses both toothed whales (odontocetes) and humans as model systems to identify the neural mechanisms that extract echo-acoustic information and the brain networks that build and learn robust, invariant representations of auditory objects in complex auditory scenes.</p>
<p>In Echospace, we undertake two interconnected components of this project:</p>
<ul>
<li>Model the echolocation-based target search by toothed whales as an information-seeking behavior by extending the <em>infotaxis</em> algorithm, originally formulated for moth odor-tracking problems, into an <em>active sensing</em> context (a minimal sketch of the core idea appears after this list)</li>
</ul>
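<p>To make the infotaxis idea concrete, here is a minimal conceptual sketch of one decision step on a grid: the searcher keeps a belief map over possible target locations and moves to wherever the expected entropy of the updated belief is lowest. The detection model <code>p_detect</code> is an assumed placeholder; this is not the extended algorithm developed in the project:</p>
<pre><code class="language-python"># Conceptual infotaxis step on a grid; p_detect is an assumed
# detection model, and this is not the project's extended algorithm.
# (Grid-boundary handling omitted for brevity.)
import numpy as np

def entropy(p):
    """Shannon entropy of a discretized belief map."""
    p = np.clip(p, 1e-12, 1.0)
    return float(-np.sum(p * np.log(p)))

def expected_entropy(belief, new_pos, p_detect):
    """Expected posterior entropy after sampling at new_pos."""
    pd = p_detect(belief, new_pos)      # P(detection | target in each cell)
    p_hit = float(np.sum(belief * pd))  # marginal probability of a detection
    post_hit = belief * pd / max(p_hit, 1e-12)
    post_miss = belief * (1.0 - pd) / max(1.0 - p_hit, 1e-12)
    return p_hit * entropy(post_hit) + (1.0 - p_hit) * entropy(post_miss)

def infotaxis_step(belief, pos, p_detect,
                   moves=((0, 1), (0, -1), (1, 0), (-1, 0))):
    """Pick the neighboring cell that minimizes expected posterior entropy."""
    candidates = [(pos[0] + dr, pos[1] + dc) for dr, dc in moves]
    return min(candidates, key=lambda q: expected_entropy(belief, q, p_detect))
</code></pre>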
<!-- TODO: link ASA 2019 talk -->
<ul>
<li>Conduct and analyze the coupled acoustic sampling and movement behaviors of an echolocating harbor porpoise in a target discrimination experiment</li>
</ul>
<!-- TODO: link ASA 2021, 2023 talks -->
<p><strong>Funding agency</strong>: Office of Naval Research, Multidisciplinary University Research Initiative (MURI) program</p></description></item><item><title>Communication within group</title><link>https://uw-echospace.github.io/group/communication/</link><pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/group/communication/</guid><description><h2 id="mode-of-communication">Mode of Communication</h2>
<p>We use various modes of communication in Echospace to keep ourselves up-to-date on what each of us is working on and to coordinate.</p>
<p>Depending on who&rsquo;s involved and the context, we use different methods to communicate. In general:</p>
<ul>
<li>Internal (among group members): mostly Slack</li>
<li>External (with colleagues outside of the group): emails</li>
<li>GitHub: PRs and issues, see
<a href="./compute-git.md">here</a> for how to get started</li>
<li>Talking: we do talk in analog form!</li>
</ul>
<h2 id="expectations">Expectations</h2>
<ul>
<li>We encourage proactive and frequent communication; for most projects we meet at least weekly to keep each other updated and set short and long term goals</li>
<li>Immediate responses are not expected unless a matter is urgent</li>
<li>People may send messages/emails at a time convenient for them; aim to respond within a reasonable time frame</li>
<li>Phone/text: usually reserved for urgent communication or offsite coordination</li>
<li>Ask us/everyone for help, and provide help if you can!</li>
</ul>
<h2 id="slack-workspace">Slack workspace</h2>
<ul>
<li>Default channels you&rsquo;re added to:
<ul>
<li>#general</li>
<li>#help</li>
<li>#random</li>
</ul>
</li>
<li>Feel free to:
<ul>
<li>Add or remove yourself from channels, but make sure you stay in the know for your projects, as well as group announcements</li>
<li>Create new channels and announce in the #general channel for others to join</li>
</ul>
</li>
</ul></description></item><item><title>Influence of duty-cycle recording on measuring bat activity in passive acoustic monitoring</title><link>https://uw-echospace.github.io/publication/2025-krishna-lee-jasa-dutycycle/</link><pubDate>Tue, 09 Sep 2025 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2025-krishna-lee-jasa-dutycycle/</guid><description/></item><item><title>Head-related transfer function predictions reveal dominant sound propagation mechanisms to the dolphin ears</title><link>https://uw-echospace.github.io/publication/2025-cheong-etal-jasa-hrtf/</link><pubDate>Tue, 08 Jul 2025 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2025-cheong-etal-jasa-hrtf/</guid><description/></item><item><title>Movement trajectories reflect active information acquisition by an echolocating porpoise in a target discrimination task</title><link>https://uw-echospace.github.io/publication/2025-lee-etal-jasa-porpoise-movement/</link><pubDate>Mon, 07 Jul 2025 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2025-lee-etal-jasa-porpoise-movement/</guid><description/></item><item><title>Scientific Software Engineering Opportunity</title><link>https://uw-echospace.github.io/position/2025-03-01-rse/</link><pubDate>Sat, 01 Mar 2025 00:00:00 -0800</pubDate><guid>https://uw-echospace.github.io/position/2025-03-01-rse/</guid><description><h2 id="open-source-software-to-accelerate-marine-ecosystem-research">Open-Source Software to Accelerate Marine Ecosystem Research</h2>
<p>The
<a href="https://uw-echospace.github.io/" target="_blank" rel="noopener">Echospace</a> research group at the University of Washington, Seattle is seeking a highly motivated individual to contribute to the development of cutting-edge software tools for marine ecosystem research. This position offers the opportunity to work on &ldquo;
<a href="https://doi.org/10.25080/WXRH8633" target="_blank" rel="noopener">Echostack</a>&rdquo;, a suite of open-source Python software packages designed to process and analyze large-scale echosounder data.</p>
<p>Echosounders are high-frequency ocean sonar systems widely used to
<a href="https://storymaps.arcgis.com/stories/e245977def474bdba60952f30576908f" target="_blank" rel="noopener">study life in the ocean</a>. By transmitting sounds and analyzing the echoes bounced off fish and zooplankton, echosounders “image” the underwater world much like how medical ultrasound images the interior of the human body, providing crucial insights into marine ecosystems. By joining this project, you will play a key role in making echosounder data more accessible and useful for researchers worldwide.</p>
<h3 id="what-youll-do">What you’ll do</h3>
<ul>
<li>Design, develop, and test open-source software for processing and analyzing large volumes of echosounder data</li>
<li>Optimize and benchmark software performance on local computers and cloud virtual machines</li>
<li>Process and analyze archived echosounder data spanning over 20 years, collected by NOAA and other US institutions</li>
<li>Build and manage near real-time data workflows from research cruises</li>
</ul>
<h3 id="who-were-looking-for">Who we’re looking for</h3>
<p>We welcome applications from recent graduates, post-baccalaureate researchers, advanced undergraduates, or current master’s students with the following qualifications:</p>
<ul>
<li>A solid foundation in software engineering</li>
<li>Proficiency with object-oriented programming, particularly in Python</li>
<li>Experience with or a strong interest in large datasets and distributed computing</li>
<li>Completion of a college-level linear algebra course</li>
<li>Commitment to learn echosounder data processing procedures and data structures</li>
<li>A strong interest in ocean science, environmental science, or related fields</li>
<li>Excellent communication skills to work effectively in an interdisciplinary team of researchers with backgrounds in engineering, mathematics, and biology</li>
</ul>
<p>Note that this position is intended for early career professionals or students. The initial appointment will be 3-6 months with a possibility of extension. The exact position setup will vary depending on your status.</p>
<h3 id="why-join-us">Why join us?</h3>
<p>This is an excellent opportunity to:</p>
<ul>
<li>Develop open-source software that directly contributes to ocean research</li>
<li>Gain experience in scientific computing, cloud computing, and environmental data science</li>
<li>Work closely with researchers at the University of Washington, NOAA, and other institutions</li>
<li>Build your technical portfolio and establish connections in the ocean technology and research community</li>
</ul>
<h3 id="resources">Resources</h3>
<ul>
<li>UW Echospace: <a href="https://uw-echospace.github.io/">https://uw-echospace.github.io/</a></li>
<li>Echostack Talk at SciPy 2024:
<a href="https://youtu.be/YRFxMGisGww" target="_blank" rel="noopener">Watch here</a></li>
<li>Echostack Paper:
<a href="https://doi.org/10.25080/WXRH8633" target="_blank" rel="noopener">Read here</a></li>
<li>Introduction to echosounder data:
<a href="https://storymaps.arcgis.com/stories/e245977def474bdba60952f30576908f" target="_blank" rel="noopener">StoryMap</a></li>
</ul>
<h3 id="how-to-apply">How to Apply</h3>
<p>To apply, email Dr. Wu-Jung Lee (<a href="mailto:leewj@uw.edu">leewj@uw.edu</a>) and Dr. Valentina Staneva (<a href="mailto:vms16@uw.edu">vms16@uw.edu</a>) with the following:</p>
<ul>
<li>A cover letter detailing:
<ul>
<li>Why you believe you are a good fit for this role</li>
<li>How this position aligns with your career goals or research interests</li>
<li>Any relevant prior experiences</li>
<li>The timeframe and capacity (part-time/full-time) in which you are available for this position</li>
</ul>
</li>
<li>A CV or resume that includes a link to your GitHub account (or equivalent portfolio) showcasing your previous software work.</li>
</ul>
<p>If your past contributions are only available in private repositories, please provide a detailed project description or create a public example that highlights your skills.</p></description></item><item><title>Interoperable and scalable echosounder data processing with Echopype</title><link>https://uw-echospace.github.io/publication/2024-lee-etal-echopype/</link><pubDate>Wed, 09 Oct 2024 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2024-lee-etal-echopype/</guid><description/></item><item><title>Echostack: An open-source Python software toolbox that democratizes water column sonar data and processing</title><link>https://uw-echospace.github.io/talk/202407-scipy-echostack/</link><pubDate>Thu, 11 Jul 2024 14:20:00 -0700</pubDate><guid>https://uw-echospace.github.io/talk/202407-scipy-echostack/</guid><description/></item><item><title>Investigation of duty cycles for measuring activity in passive acoustic bat monitoring</title><link>https://uw-echospace.github.io/talk/202405-aditya-duty-cycle/</link><pubDate>Tue, 14 May 2024 10:10:00 -0100</pubDate><guid>https://uw-echospace.github.io/talk/202405-aditya-duty-cycle/</guid><description/></item><item><title>Variability and influence of fisheries acoustic echogram annotations on machine learning applications</title><link>https://uw-echospace.github.io/talk/202405-asa-ottawa-hake/</link><pubDate>Mon, 13 May 2024 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/talk/202405-asa-ottawa-hake/</guid><description/></item><item><title>Two Pieces of the Same Puzzle: Active and Passive Acoustics for Cross-Trophic Marine Ecosystem Monitoring Part II</title><link>https://uw-echospace.github.io/talk/202405-asa-ottawa-school/</link><pubDate>Sun, 12 May 2024 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/talk/202405-asa-ottawa-school/</guid><description/></item><item><title>A ship-to-cloud machine learning pipeline built on the open-source Python Echostack software tools</title><link>https://uw-echospace.github.io/talk/202404-wgfast-ship2cloud/</link><pubDate>Thu, 11 Apr 2024 09:00:00 -0500</pubDate><guid>https://uw-echospace.github.io/talk/202404-wgfast-ship2cloud/</guid><description/></item><item><title>Scalable and configurable echosounder data workflows</title><link>https://uw-echospace.github.io/talk/202404-wgfast-echodataflow/</link><pubDate>Thu, 11 Apr 2024 09:00:00 -0500</pubDate><guid>https://uw-echospace.github.io/talk/202404-wgfast-echodataflow/</guid><description/></item><item><title>A Summer of Refactoring Echoshader!</title><link>https://uw-echospace.github.io/2023/09/18/a-summer-of-refactoring-echoshader/</link><pubDate>Mon, 18 Sep 2023 00:00:00 -0800</pubDate><guid>https://uw-echospace.github.io/2023/09/18/a-summer-of-refactoring-echoshader/</guid><description><p><em>Echospace recruited contributor
<a href="https://github.com/ldr426" target="_blank" rel="noopener">Dingrui Lei</a> in 2023 to refactor an echosounder data interactive visualization package called
<a href="https://github.com/OSOceanAcoustics/echoshader" target="_blank" rel="noopener">echoshader</a>.</em></p>
<hr>
<h1 id="my-2023-summer-internship-with-echoshader-a-dive-into-advanced-ocean-sonar-data-visualization">My 2023 Summer Internship with Echoshader: A Dive into Advanced Ocean Sonar Data Visualization</h1>
<p>Author:
<a href="mailto:leidingrui426@gmail.com">Dingrui Lei</a></p>
<p>Ref 1:
<a href="https://docs.google.com/presentation/d/1HmL2-luVmA9T5HfS3L1kBu8c7dDHo75znwaS-8YlTSE/edit#slide=id.p" target="_blank" rel="noopener">Slides</a> of presentation</p>
<p>Ref 2:
<a href="https://echoshader--140.org.readthedocs.build/en/140/intro.html" target="_blank" rel="noopener">Docs</a> for this version</p>
<p>Hello, readers! I&rsquo;m excited to share my summer internship experience working on the fascinating project, Echoshader. This Python package, designed to enhance the visualization of ocean sonar data, has been my focus this summer. While I won&rsquo;t be delving into technical jargon, I&rsquo;ll give you a glimpse of my journey, the challenges I faced, and the accomplishments achieved during my internship. The prototype was built during
<a href="https://summerofcode.withgoogle.com/programs/2022/organizations/ioos" target="_blank" rel="noopener">GSoC 2022</a>.</p>
<h2 id="echoshader-bridging-the-gap-in-ocean-sonar-data-visualization">Echoshader: Bridging the Gap in Ocean Sonar Data Visualization</h2>
<p>Before I jump into the technical details, let&rsquo;s take a moment to understand the significance of ocean sonar systems. These systems, including echosounders, are the unsung heroes of marine research. They help scientists study marine life by emitting sound waves and analyzing the echoes that bounce back. Think of it as an underwater ultrasound for the ocean. The data generated from these systems is invaluable for monitoring and conserving our marine ecosystems.</p>
<p>Echoshader, our summer project, aims to make this data more accessible and interactive. It&rsquo;s like a powerful toolset that enables scientists and researchers to visualize and analyze ocean sonar data effortlessly. But let&rsquo;s get into the nitty-gritty of my experience.</p>
<h2 id="building-the-echoshader-a-structured-journey">Building the Echoshader: A Structured Journey</h2>
<p>My summer project was all about creating and refining the Echoshader package. This package is the backbone of our mission, providing oceanographers and researchers with the tools they need to visualize and understand ocean sonar data. Here&rsquo;s how I structured my work:</p>
<h3 id="1-the-echoshader-class-a-controller-for-visualization">1. The Echoshader Class: A Controller for Visualization</h3>
<p>At the heart of Echoshader lies the Echoshader class. This class is like the conductor of an orchestra, coordinating user interactions, data updates, and visualizations. My task was to make sure this class was robust and user-friendly.</p>
<p>I defined the class and set up initial values and interactive widgets. These widgets allow users to tweak parameters and explore data interactively.</p>
<h3 id="2-callbacks-and-streams-making-it-interactive">2. Callbacks and Streams: Making It Interactive</h3>
<p>Echoshader needed to be interactive, allowing users to explore data dynamically. This required creating callback methods and stream objects. These elements connected user interactions to visualization updates, making the whole experience smooth and intuitive.
<img width="594" alt="image" src="https://github.com/ldr426/add-ldr426-page/assets/56751303/472ca1bf-4ec3-4e27-82db-20dba3f7fa58"></p>
<h3 id="3-extending-xarray-with-accessors-a-new-level-of-functionality">3. Extending <code>Xarray</code> with Accessors: A New Level of Functionality</h3>
<p>One of the exciting challenges I encountered was extending <code>xarray</code>&rsquo;s functionality using accessors. This means adding custom methods and functionality to <code>xarray</code> objects without cluttering the global namespace with standalone functions. We created a custom &ldquo;eshader&rdquo; accessor, which allowed us to take echogram visualization to the next level.</p>
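<p>The accessor mechanism itself is part of the public <code>xarray</code> API. Below is a minimal sketch of the pattern; the real <code>eshader</code> accessor exposes interactive plotting methods, whereas this toy version only selects data:</p>
<pre><code class="language-python"># Minimal sketch of the xarray accessor pattern behind "eshader";
# the toy method below is illustrative, not the real echoshader API.
import xarray as xr

@xr.register_dataset_accessor("eshader")
class EshaderAccessor:
    def __init__(self, xarray_obj):
        self._ds = xarray_obj

    def echogram(self, channel=0):
        # echoshader would build an interactive plot here;
        # this stand-in just selects one channel of Sv.
        return self._ds["Sv"].isel(channel=channel)

# Any Dataset with an "Sv" variable now gains the accessor:
#     ds.eshader.echogram(channel=1)
</code></pre>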
<h2 id="a-glimpse-into-echogram-visualization">A Glimpse into Echogram Visualization</h2>
<p>Echogram visualization is where the magic happens. It&rsquo;s not just about pretty pictures; it&rsquo;s about gaining insights into marine life and ecosystems.</p>
<ul>
<li><strong>Echograms for Identifying Fish</strong>: Fisheries scientists rely on echograms to identify fish aggregations, scrolling through data collected on ships to assess populations.</li>
<li><strong>Echograms for Observing Zooplankton</strong>: Oceanographers use echograms to observe zooplankton movements in mooring data over extended periods.</li>
<li><strong>Tricolor Echograms</strong>: The &ldquo;tricolor&rdquo; echogram helps distinguish different fish species, thanks to its clever mapping of three frequencies to RGB colors (see the sketch below).</li>
</ul>
<img width="613" alt="single_frequency_echogram" src="https://github.com/ldr426/add-ldr426-page/assets/11621647/a51a6a76-c73d-46c1-84dd-8df643438f07">
<img width="613" alt="tricolor_echogram" src="https://github.com/ldr426/add-ldr426-page/assets/11621647/0ba62c35-5dd0-41db-9225-db0290be1215">
<h2 id="tracking-and-curtain-visualization">Tracking and Curtain Visualization</h2>
<p>One of the most exciting aspects of Echoshader is tracking and curtain visualization. It&rsquo;s like having a GPS for underwater data.</p>
<ul>
<li><strong>Echogram-Control Mode</strong>: Visualizing data on a map helps assess fish associations with environmental variables.</li>
<li><strong>Track-Control Mode</strong>: Highlighting ship track sections on the map while viewing corresponding echograms offers precise insights into marine life at specific locations.</li>
<li><strong>Curtain Visualization</strong>: Representing longer data sections as curtains provides a broader spatial perspective on fish aggregations.
<img width="609" alt="image" src="https://github.com/ldr426/add-ldr426-page/assets/56751303/dab25ce8-bba3-4062-85e9-4ce3231172fc"></li>
</ul>
<img width="502" alt="image" src="https://github.com/ldr426/add-ldr426-page/assets/56751303/2a1a33df-c639-4b53-9df2-ae2523cd3901">
<h2 id="histograms-and-statistics-tables-tools-for-deeper-analysis">Histograms and Statistics Tables: Tools for Deeper Analysis</h2>
<p>Histograms and statistics tables are essential for fisheries scientists.</p>
<ul>
<li><strong>Focused Analysis</strong>: Scientists can zoom in on specific data sections to examine volume backscattering strength (Sv) distribution and understand the types of fish present.</li>
<li><strong>Multi-Channel Comparisons</strong>: Comparing Sv distributions across multiple echosounder channels helps determine fish aggregation composition, offering valuable insights into the ecosystem (see the sketch after this list).
<img width="722" alt="image" src="https://github.com/ldr426/add-ldr426-page/assets/56751303/623ca032-cab8-4fe0-8f37-57774a64e1b0"></li>
</ul>
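<p>As a sketch of the computation behind such a table: Sv is in decibels, so a physically meaningful mean is taken in the linear domain and then converted back to dB. The dataset and channel names below are illustrative assumptions.</p>
<pre><code>import numpy as np
import xarray as xr

def sv_stats(MVBS_ds: xr.Dataset, channel: str) -> dict:
    """Summary statistics of Sv for one channel (illustrative sketch)."""
    Sv = MVBS_ds["Sv"].sel(channel=channel)
    # Average backscatter in the linear domain, then convert back to dB
    mean_dB = 10 * np.log10((10 ** (Sv / 10)).mean(skipna=True))
    return {
        "mean (dB re 1 m^-1)": float(mean_dB),
        "median (dB)": float(Sv.median(skipna=True)),
        "std (dB)": float(Sv.std(skipna=True)),
    }

# Matching histogram, one per channel for multi-channel comparison:
# MVBS_ds["Sv"].sel(channel=ch).hvplot.hist(bins=50)
</code></pre>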
<h2 id="in-conclusion-an-incredible-summer-journey">In Conclusion: An Incredible Summer Journey</h2>
<p>My summer internship with Echoshader has been a remarkable journey. I&rsquo;ve had the privilege of contributing to a project that advances oceanographic research and fisheries science. Echoshader isn&rsquo;t just a package; it&rsquo;s a gateway to uncovering the secrets of our oceans.</p>
<p>If you&rsquo;re curious about ocean sonar data or want to explore the world of marine life, Echoshader is your partner in discovery. Feel free to reach out if you have questions or want to join us on this exciting journey. Until next time, happy exploring!</p></description></item><item><title>Investigation of duty cycles in passive acoustic bat monitoring</title><link>https://uw-echospace.github.io/talk/202305-urp-symposium/</link><pubDate>Fri, 19 May 2023 15:30:00 +0200</pubDate><guid>https://uw-echospace.github.io/talk/202305-urp-symposium/</guid><description/></item><item><title>Hello from Dingrui Lei, GSoC contributor of Echoshader!</title><link>https://uw-echospace.github.io/2022/07/28/hello-from-dingrui-lei-gsoc-contributor-of-echoshader/</link><pubDate>Thu, 28 Jul 2022 00:00:00 -0800</pubDate><guid>https://uw-echospace.github.io/2022/07/28/hello-from-dingrui-lei-gsoc-contributor-of-echoshader/</guid><description><p><em>Echospace collaborates with the
<a href="https://ioos.us/" target="_blank" rel="noopener">Integrated Ocean Observing Systems (IOOS)</a> in the
<a href="https://summerofcode.withgoogle.com/" target="_blank" rel="noopener">Google Summer of Code (GSoC)</a> program in 2022 to jump start an echosounder data interactive visualization package called
<a href="https://github.com/OSOceanAcoustics/echoshader" target="_blank" rel="noopener">echoshader</a>.</em></p>
<p><em>
<a href="https://github.com/ldr426" target="_blank" rel="noopener">Dingrui Lei</a> is our great GSoC contributor, and our very own
<a href="author/don-setiawan">Don Setiawan</a> is the primary mentor.</em></p>
<hr>
<p>My name is Dingrui Lei and I am a new graduate student at Rice University. My experience has given me a broad understanding of how computer science can solve engineering problems and facilitate new technology development, and I&rsquo;d like to put that knowledge to use.</p>
<p>Before contacting the
<a href="https://ioos.us/" target="_blank" rel="noopener">IOOS</a> community, I read the article &ldquo;
<a href="https://storymaps.arcgis.com/stories/e245977def474bdba60952f30576908f" target="_blank" rel="noopener">Understanding Our Ocean with Water-Column Sonar Data</a>,&rdquo; and an introduction to the project
<a href="https://uw-echospace.github.io/software/echopype/" target="_blank" rel="noopener">echopype</a>. Sonar is very intriguing to me, it can continuously detect the activities of sea creatures in the dimension of space and time. The depth of fish clusters changing with solar radiation really made me see the splendid usefulness of sonar data.</p>
<p><img src="https://ioos.us/images/IOOS_Emblem_Tertiary_B_RGB.png" alt="The IOOS Logo - The U.S. Integrated Ocean Observing System (IOOS)"></p>
<p>One of the main focuses of the
<a href="https://uw-echospace.github.io/author/echospace/" target="_blank" rel="noopener">Echospace</a> team is sampling and interpretation of ocean acoustic data.
<a href="https://github.com/OSOceanAcoustics/echopype" target="_blank" rel="noopener">Echopype</a> sits in the middle, extracts raw data from the cloud or file server, converts them to netCDF or Zarr, and performs denoising and calibration. Another job is to interpret, where I give my effort to build a library called echoshader that can help oceanographers discover certain patterns from it.
<a href="https://github.com/OSOceanAcoustics/echoshader" target="_blank" rel="noopener">Echoshader</a>, an open source project, aims to enhance the ability to interactively visualize large amounts of cloud-based data to accelerate the data exploration and discovery process. Ocean sonar data are generated from echopype, which handles the normalization, preprocessing and organization of echo data. Echoshader will be developed in parallel with the ongoing development of echopype.</p>
<p>As a participant of GSoC, I am developing the main APIs of echoshader based on the
<a href="https://holoviz.org/" target="_blank" rel="noopener">HoloViz</a> suite of tools, test configuration for using echoshader widgets in Panel dashboards, and create Jupyter notebooks to demo use of the combination of tools.</p>
<p><img src="https://miro.medium.com/max/1400/1*xQEm58a7c_g1Go9G5NMyuw.jpeg" alt="Deploying Panel (Holoviz) dashboards using Heroku Container Registry | by Ali Shahid | Towards Data Science"></p>
<p>Before starting to code, I read lots of documentation to find the most suitable tool. Although there are many excellent visualization libraries in Python, such as Plotly and Bokeh, they cannot directly handle xarray objects, the multidimensional labeled data structures used heavily in echopype. I therefore settled on the HoloViz ecosystem, whose tools generally work with standard Python data types (lists, dictionaries, etc.) as well as Pandas or Dask DataFrames and NumPy, Xarray, or Dask arrays.
After deciding on this ecosystem, I read the user guides for the HoloViz libraries. Several of them are central to echoshader: hvPlot, HoloViews, GeoViews, and Panel. hvPlot and HoloViews declare objects for instantly visualizable data, building Bokeh plots from convenient high-level specifications. GeoViews visualizes geographic data such as ship survey tracks. Panel assembles plots and control widgets from these different libraries into a layout that can be displayed in a Jupyter notebook or served as a standalone dashboard. Beyond the HoloViz libraries, PyVista and others are involved for the 3D extension, and they also fit well in a Panel layout. Benchmarking and documentation work are also required for each module.
Below are some screenshots of the different visualization functionalities I am developing:</p>
<p><img src="https://user-images.githubusercontent.com/15334215/186999651-76081a29-11f8-4d37-b3a9-fca0ad49a03c.png" alt="2d_echogram"></p>
<p><img src="https://user-images.githubusercontent.com/15334215/186999662-ba744a49-b02e-4451-a716-f8c8df654053.png" alt="tracks"></p>
<p><img src="https://user-images.githubusercontent.com/15334215/186999678-2bf77985-aab3-42f8-88f9-1f2c78d3b2eb.png" alt="curtain"></p>
<p>Although the project is not technically difficult, I face some other challenges. Learning Git and GitHub was a prerequisite for participating in an open source project for the first time. It is also my first time collaborating with an English-speaking team, and I had difficulty reading and writing English documents, not to mention communicating. Fortunately, the mentors (Wu-jung, Don, Valentina, Brandon, and Emilio) are all kind and warmhearted, willing to give me suggestions and guidance.</p>
<p>I really recommend that future GSoC participants select the IOOS organization and the Echospace team as their target and apply their abilities and talents to contributing to the community. Water is extremely significant for maintaining an adequate food supply and a productive environment for all living organisms. So working here will not just improve your coding and teamwork skills, but also help create a beautiful tomorrow for ourselves and our Mother Earth.</p>
diel fish movements around a shallow water artificial reef using a mid-frequency horizontal-looking sonar</title><link>https://uw-echospace.github.io/publication/2018-lee-etal-jasa-trex-fish/</link><pubDate>Tue, 18 Sep 2018 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2018-lee-etal-jasa-trex-fish/</guid><description/></item><item><title>Tongue-driven sonar beam steering by a lingual-echolocating fruit bat</title><link>https://uw-echospace.github.io/publication/2017-lee-etal-plosbio-rousettus-bp/</link><pubDate>Fri, 15 Dec 2017 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2017-lee-etal-plosbio-rousettus-bp/</guid><description/></item><item><title>News</title><link>https://uw-echospace.github.io/news/</link><pubDate>Fri, 01 Dec 2017 00:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/news/</guid><description>
<p><strong>[10/2025]</strong> <a href="https://uw-echospace.github.io/author/ameena-majeed">Ameena Majeed</a> joined Echospace as an Undergrad Research Assistant. Welcome!</p>
<p><strong>[09/2025]</strong> Aditya and Wu-Jung published a paper in JASA on <a href="https://doi.org/10.1121/10.0039108">the effects of duty-cycling in PAM of bats</a>!</p>
<p><strong>[08/2025]</strong> <a href="https://uw-echospace.github.io/author/aidan-lee">Aidan Lee</a> joined Echospace as an Undergrad Research Assistant. Welcome!</p>
<p><strong>[07/2025]</strong> Wu-Jung and YeonJoon published two papers in JASA, on <a href="https://doi.org/10.1121/10.0037038">porpoise movements during target discrimination</a> and <a href="https://doi.org/10.1121/10.0036904">dolphin head-related transfer function prediction</a>!</p>
<p><strong>[05/2025]</strong> Multiple Echospace members hosted the second <a href="https://boat-ocean-acoustics.github.io/">Bridge to Ocean Acoustics &amp; Technology (BOAT)</a> workshop in New Orleans!</p>
<p><strong>[03/2025]</strong> Multiple Echospace members will be hosting the first <a href="https://boat-ocean-acoustics.github.io/">Bridge to Ocean Acoustics &amp; Technology (BOAT)</a> workshop in Seattle!</p>
<p><strong>[02/2025]</strong> Wu-Jung was awarded the 2025 APL Science and Engineering Achievement Award! Congratulations!</p>
<p><strong>[02/2025]</strong> Caesar was accepted to the UW ECE PhD program and will start this fall, continuing his research in Echospace! Congratulations, Caesar!</p>
<p><strong>[07/2024]</strong> Wu-Jung gave a talk on our <a href="https://proceedings.scipy.org/articles/WXRH8633">Echostack software suite</a> and Valentina presented a poster on the <a href="https://proceedings.scipy.org/articles/JXDK4427">Echodataflow package</a> at the Scipy 2024 conference.</p>
<p><strong>[06/2024]</strong> Aditya attended the <a href="https://eos.unh.edu/center-acoustics-research-education/education/bioacoustic-summer-school-seabass">BioAcoustic Summer School (SeaBASS) at the University of New Hampshire</a> and met some inspiring lecturers and students! Wu-Jung also gave a lecture on Fundamentals of Ocean Acoustics!</p>
<p><strong>[05/2024]</strong> Wu-Jung and Aditya presented two talks in the <a href="https://acousticalsociety.org/ottawa/">Ottawa ASA Meeting</a> on <a href="talk/202405-asa-ottawa-hake/">evaluating the hake ML model</a> and <a href="talk/202405-aditya-asa/">the impacts of duty-cycle PAM for bats</a>.</p>
<p><strong>[05/2024]</strong> Wu-Jung gave a lecture on active acoustic ocean sensing and a best practice in scientific computing tutorial at <a href="https://acousticalsociety.org/asa-school-2024/">ASA School 2024 in Ottawa</a> as an instructor and met some wonderful students!</p>
<p><strong>[04/2024]</strong> Wu-Jung and Valentina presented two talks at the <a href="https://www-iuem.univ-brest.fr/wgfast/?lang=en">WGFAST 2024</a> meeting in France on pipelines and software tools for echosounder data processing on both ship and cloud.</p>
<p><strong>[04/2024]</strong> Wu-Jung presented on <a href="https://echolevels.readthedocs.io/en/latest/levels_proposed.html">Echosounder Data Processing Levels</a> (with contributions from Emilio, Brandyn, and Valentina) at the Global Acoustics INteroperable (GAIN) workshop associated with the WGFAST 2024 meeting.</p>
<p><strong>[03/2024]</strong> Wu-Jung and Valentina hosted 2 Capstone teams in the Master of Science in Data Science program for sonar data processing and automatic bat call detection.</p>
<p><strong>[12/2023]</strong> We welcome <a href="https://uw-echospace.github.io/author/brandyn-lucca">Dr. Brandyn Lucca</a> to join Echospace as a <a href="https://seeyourselfapl.uw.edu/seed-postdoctoral-fellowship/">SEED</a> postdoctoral fellow!</p>
<p><strong>[12/2023]</strong> Soham gave a talk at 2023 PyData Global &ndash; check out his <a href="https://global2023.pydata.org/cfp/talk/FF9MXK/">abstract</a> and the <a href="https://www.youtube.com/watch?v=9hr9rSOq5jQ">video recording</a>!</p>
<p><strong>[10/2023]</strong> Valentina and Wu-Jung gave a talk and a poster presentation in the <a href="https://meetings.pices.int/meetings/annual/2023/PICES/news">2023 North Pacific Marine Science Organization (PICES) meeting</a> in Seattle.</p>
<p><strong>[09/2023]</strong> YeonJoon was selected as a <a href="https://escience.washington.edu/people/postdoctoral-fellows/">UW Data Science Postdoctoral Fellow</a>.</p>
<p><strong>[09/2023]</strong> Wu-Jung joined the NOAA NCEI Water Column Sonar Data Archive stakeholder workshop and engaged in Echopype Q&amp;As.</p>
<p><strong>[08/2023]</strong> Wu-Jung, Emilio, and Valentina hosted <a href="https://oceanhackweek.org/about/pasthackweeks.html#ohw23">OceanHackWeek 2023</a> at UW with an international organizer team!</p>
<p><strong>[06/2023]</strong> Valentina and Wu-Jung went to sea with the Hake survey on the NOAA FSV Bell M. Shimada!</p>
<p><strong>[05/2023]</strong> <a href="https://uw-echospace.github.io/author/caesar-tuguinay">Caesar Tuguinay</a> joined the Echospace group as a Research Assistant. Welcome!</p>
<p><strong>[05/2023]</strong> We welcome Dingrui Lei and <a href="https://uw-echospace.github.io/author/soham-kishor-butala">Soham Butala</a> to join Echospace as summer interns!</p>
<p><strong>[05/2023]</strong> YeonJoon and Wu-Jung gave two talks on target discrimination behavior by a harbor porpoise and numerical modeling of sound transduction in the dolphin head at the <a href="https://acousticalsociety.org/asa-meetings/">Chicago ASA meeting</a>.</p>
<p><strong>[05/2023]</strong> Aditya presented his ongoing research on the effects of subsampling for passive acoustic monitoring of bats at <a href="https://expo.uw.edu/expo/apply/676/proceedings">UW&rsquo;s 26th Annual Undergraduate Research Symposium</a>.</p>
<p><strong>[03/2023]</strong> Valentina and Wu-Jung gave two talks on our software and machine learning developments at the <a href="https://seagrant.umaine.edu/focus-areas/healthy-coastal-ecosystems/ices-fisheries-and-plankton-acoustics-symposium/">2023 ICES Fisheries and Plankton Acoustics Symposium</a> in Portland, Maine, and together with Emilio engaged with the international fisheries acoustic community in data format discussions.</p>
<p><strong>[02/2023]</strong> Wu-Jung was invited to serve on the committee for <a href="https://www.nationalacademies.org/our-work/ocean-acoustics-education-and-expertise">Ocean Acoustics Education and Expertise</a> of the National Academies, commissioned by ONR.</p>
<p><strong>[12/2022]</strong> Aditya has been awarded the <a href="https://expd.uw.edu/mge/apply/research/">Mary Gates Research Scholarship</a> for his research on passive acoustic monitoring of bats in the Union Bay Natural Area!</p>
<p><strong>[08/2022]</strong> Many of us in Echospace and alumnus Derya are hosting the <a href="https://oceanhackweek.github.io/ohw22/index.html">OceanHackWeek 2022</a> <a href="https://oceanhackweek.github.io/ohw22/seattle/index.html">Northwest satellite</a> this week!</p>
<p><strong>[07/2022]</strong> We welcome <a href="https://uw-echospace.github.io/author/yeonjoon-cheong">Dr. YeonJoon Cheong</a> to join Echospace as a postdoc scholar!</p>
<p><strong>[05/2022]</strong> We have released a <a href="https://echopype.readthedocs.io/en/stable/whats-new.html">new, major version of echopype, 0.6.0</a>. There are significant breaking changes, but also significant improvements in convention adherence, consistency across sensors, and dataset documentation.</p>
<p><strong>[05/2022]</strong> Wu-Jung will be giving the keynote lecture on &ldquo;Understanding Echoes&rdquo; in the <a href="https://acousticalsociety.org/asa-meetings/#KL">ASA Denver meeting</a>.</p>
<p><strong>[05/2022]</strong> Aditya gave a talk on using machine learning to monitor bats in <a href="https://expo.uw.edu/expo/apply/635/proceedings">UW&rsquo;s 25th Annual Undergraduate Research Symposium</a>.</p>
<p><strong>[04/2022]</strong> Wu-Jung and Emilio gave two talks on <a href="https://uw-echospace.github.io/talk/202204-wgfast-echopype">echopype updates and roadmap</a> in the 2022 WGFAST meeting and the 2022 NOAA NCEI Water Column Sonar Data Archive workshop.</p>
<p><strong>[04/2022]</strong> Valentina gave a talk on <a href="https://uw-echospace.github.io/talk/202204-wgfast-ooi-nmf">analyzing OOI echosounder data using matrix decomposition</a> in the 2022 WGFAST meeting.</p>
<p><strong>[11/2021]</strong> Valentina gave a tutorial at the Seattle ASA meeting on <a href="https://acousticalsociety.org/wp-content/uploads/2022/01/MeetingInformationSeattle.pdf#page=3">Software Best Practices</a>.</p>
<p><strong>[11/2021]</strong> New paper <a href="https://doi.org/10.1371/journal.pone.0260485">&ldquo;Beluga whale (<em>Delphinapterus leucas</em>) acoustic foraging behavior and applications for long term monitoring&rdquo;</a> was published in PLOS One!</p>
<p><strong>[10/2021]</strong> New preprint <a href="https://arxiv.org/abs/2111.00187">&ldquo;Echopype: A Python library for interoperable and scalable processing of water column sonar data for biological information&rdquo;</a> was posted on arXiv!</p>
<p><strong>[10/2021]</strong> Emilio and Wu-Jung gave the IOOS DMAC webinar on <a href="https://uw-echospace.github.io/talk/202110-ioos-dmac">&ldquo;Scalable, interoperable processing of water column sonar data for biological applications using the echopype Python package&rdquo;</a>.</p>
<p><strong>[10/2021]</strong> Wu-Jung gave the UW Data Science Seminar on <a href="https://uw-echospace.github.io/talk/202110-uw-data-sci">&ldquo;Building a toolbox for studying marine ecology using large ocean sonar datasets&rdquo;</a>.</p>
<p><strong>[09/2021]</strong> Wu-Jung and Linda successfully completed this summer&rsquo;s fieldwork evaluating the use of an ADCP-equipped glider as a biological monitoring tool. Check out <a href="https://oceanexplorer.noaa.gov/technology/development-partnerships/21adcp-gliders/welcome.html">NOAA Exploration&rsquo;s coverage of this mission</a>!</p></description></item><item><title>Dynamic echo information guides flight in the big brown bat</title><link>https://uw-echospace.github.io/publication/2016-warnecke-etal-frontier-echo-flow/</link><pubDate>Tue, 01 Mar 2016 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2016-warnecke-etal-frontier-echo-flow/</guid><description/></item><item><title>Statistics of broadband echoes: application to acoustic estimates of numerical density of fish</title><link>https://uw-echospace.github.io/publication/2015-lee-stanton-joe-broadband/</link><pubDate>Tue, 01 Dec 2015 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2015-lee-stanton-joe-broadband/</guid><description/></item><item><title>Bats regulate biosonar based on the availability of visual information</title><link>https://uw-echospace.github.io/publication/2015-danilovich-etal-currbio-rousettus-light/</link><pubDate>Wed, 01 Jul 2015 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2015-danilovich-etal-currbio-rousettus-light/</guid><description/></item><item><title>Can the elongated hindwing tails of fluttering moths serve as false sonar targets to divert bat attacks?</title><link>https://uw-echospace.github.io/publication/2016-lee-moss-jasa-luna-moth/</link><pubDate>Fri, 01 May 2015 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2016-lee-moss-jasa-luna-moth/</guid><description/></item><item><title>Statistics of echoes from mixed assemblages of scatterers with different scattering amplitudes and numerical densities</title><link>https://uw-echospace.github.io/publication/2014-lee-stanton-joe-mixed/</link><pubDate>Mon, 13 Jan 2014 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2014-lee-stanton-joe-mixed/</guid><description/></item><item><title>Orientation dependence of broadband acoustic backscattering from live squid</title><link>https://uw-echospace.github.io/publication/2012-lee-etal-jasa-squid/</link><pubDate>Fri, 01 Jun 2012 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2012-lee-etal-jasa-squid/</guid><description/></item><item><title>Long-duration anesthetization of squid (Doryteuthis pealeii)</title><link>https://uw-echospace.github.io/publication/2010-mooney-etal-squid-sedation/</link><pubDate>Thu, 01 Apr 2010 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2010-mooney-etal-squid-sedation/</guid><description/></item><item><title>The acoustic field on the forehead of echolocating Atlantic bottlenose dolphins (Tursiops truncatus)</title><link>https://uw-echospace.github.io/publication/2010-au-etal-jasa-tursiops-suctioncup/</link><pubDate>Mon, 01 Mar 2010 09:00:00 +0000</pubDate><guid>https://uw-echospace.github.io/publication/2010-au-etal-jasa-tursiops-suctioncup/</guid><description/></item></channel></rss>