Participate in a User Study: Search Engine for Genome-Mapped Visualizations
Posted on 12 August 2025
Are you a genomics researcher who works with genome-mapped data and visualizations?
We want to hear from you!
What’s this about?
We’re developing a new search engine to help researchers easily find genomics visualizations and adapt them as customizable templates for their own projects, enabling a more efficient workflow for exploration and analysis. Your feedback will help us improve the tool and make it more useful for scientists like you.
Who can join?
Anyone with experience working with genome-mapped data and visualizations, including but not limited to:
- Graduate students & postdocs
- Faculty & clinicians
- Bioinformatics researchers
- Software engineers working with genomics data
Genome-mapped data is data that has been mapped to genomic coordinates, and only visualizations that display these coordinates count as genome-mapped data visualizations. If this describes your work, you’re exactly who we’re looking for!
What to expect?
- Share your workflow: We’d like to start by hearing about your usual genomics visualization workflow to understand your context of use.
- Try our tool: Get a guided walkthrough and hands-on demo.
- Give us feedback: Tell us what works, what doesn’t, and how it could fit into your work.
Session details
- Format: The discussion will be informal and conversational. No preparation needed!
- When & where: In person at Countway Library or virtually via Zoom, August 13–15, 2025
- Duration: 30 minutes, scheduled at your convenience
How to sign up?
It’s easy!
Contact Huyen N. Nguyen at huyen_nguyen@hms.harvard.edu (subject: “User Study”), or send her a message on DBMI Slack @ Huyen N. Nguyen for a quick chat!
Questions?
Reach out via email (huyen_nguyen@hms.harvard.edu) or send a Slack message. We’re happy to chat!
We appreciate your time and expertise. Thank you for helping improve this tool for the genomics community!
This user study is part of a larger project by our team: Huyen N. Nguyen, Sehi L’Yi, Thomas Smits, Shanghua Gao, Marinka Zitnik, and Nils Gehlenborg. Source code is available on GitHub.