One-session lab: Set up website tracking and measure conversions with GA4 and Hotjar
A one-class GA4 and Hotjar lab: install tracking, measure conversions, read heatmaps, and write two evidence-based UX recommendations.
If you need a website analytics exercise that students can finish in one class, this lab is built for you. In a single session, learners will install Google Analytics 4, create a few basic conversion tracking events, launch a Hotjar heatmap, and turn the findings into two concrete UX recommendations. The point is not to master every setting in GA4 or Hotjar; the point is to teach the workflow of observing behavior, identifying friction, and making a decision based on evidence. That makes this a strong GA4 lab for analytics classes, SEO courses, digital marketing modules, and student projects.
The lab is intentionally practical. Students do not just read about metrics; they set up tracking, verify it works, inspect a live page, and write a short report that links data to action. That mirrors how real teams work when they use website tracking tools to improve landing pages, campaigns, and content. It also reinforces a key lesson from analytics practice: traffic alone is not success. The real measure is whether people complete the action you want, which is why event setup matters as much as pageviews.
For a class that needs a clean, reproducible assignment, this guide gives you the lab plan, the instructor checklist, the student steps, a comparison table, a reporting template, and troubleshooting notes. If your learners are new to measurement, you can also pair this activity with a quick intro on digital systems, such as how teams think about data contracts, or how funnels are influenced by page design, similar to the ideas in visual comparison pages that convert. The result is a hands-on tutorial that feels realistic, not abstract.
1. What students will learn in this one-session lab
Install and verify GA4 measurement
Students will connect a website to Google Analytics 4, confirm that data is arriving, and understand the relationship between a pageview and an event. This is the foundation of any website analytics workflow because without reliable data, every later decision is guesswork. In practical terms, they learn how measurement starts, where data appears, and what a successful installation looks like. That turns the abstract term “tracking” into a visible process.
Create basic conversion events
The lab then moves from general traffic to meaningful actions. Students will create at least one or two events that represent a conversion, such as a form submit, a button click, a download, or a thank-you page view. In many classrooms, this is the first time learners see why conversion tracking is more useful than raw visitor counts. It helps them connect user behavior to business or learning goals.
Interpret behavior using Hotjar
Finally, students will use a Hotjar heatmap to see where users click, how far they scroll, and which elements attract attention. This is important because numbers in GA4 tell you what happened, but heatmaps help explain why it happened. In a classroom setting, that contrast creates an immediate learning moment: the same page can have healthy traffic and poor conversion if the call to action is hidden, confusing, or visually weak.
2. Lab setup: tools, accounts, and class timing
What you need before class begins
To keep the session moving, the instructor should prepare a simple website or demo landing page in advance. A WordPress page, a static HTML page, or a sandbox website is enough as long as students can add tracking code or use Google Tag Manager. You will also need a GA4 property, a Hotjar account, a sample conversion goal, and a browser where students can use the extension or snippet installer. If your class works in marketing or product teams, you can compare the setup mindset to planning for an office setup: the process is easier when every tool is ready before the work begins.
Suggested 75- to 90-minute schedule
A single class works best if it is tightly structured. Spend 10 minutes introducing the scenario and the conversion goal, 20 minutes connecting GA4, 15 minutes creating and testing the conversion event, 15 minutes configuring Hotjar, 10 minutes reviewing the first heatmap view, and the final 10 to 20 minutes drafting the short report. This makes the lab realistic but manageable. The time pressure also mirrors how marketing teams often work when they need a quick improvement before launching a campaign.
Instructor prep checklist
Before students arrive, verify that the demo site is live, the tracking IDs are available, and access permissions are clear. If students are using their own accounts, make sure privacy notices are covered and that any recording features are limited to the class demo environment. It also helps to prepare a sample report template so students know exactly what to submit. For a broader view of how tools support performance measurement, you can compare this workflow with the practical differences in Google Search Console, GA4, and heatmaps.
3. Step-by-step: install GA4 and confirm data collection
Create the GA4 property and data stream
Start in Google Analytics by creating a GA4 property if one does not already exist. Then add a web data stream for the class site and copy the measurement ID. If students are using Tag Manager, they can place the GA4 configuration tag there; if not, they can use the direct gtag installation method. The key educational goal is to show that analytics is not magic: it is a chain of configuration, code, and verification. That chain is what makes reliable measurement possible.
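For students using the direct method, it helps to show what the installation actually looks like. The sketch below is the standard gtag.js snippet pasted into the page `<head>`; `G-XXXXXXXXXX` is a placeholder that must be replaced with the real measurement ID copied from the web data stream.

```html
<!-- Google tag (gtag.js): paste into the <head> of every page on the class site.
     G-XXXXXXXXXX is a placeholder; use your property's real measurement ID. -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```

Walking through the snippet line by line reinforces the "configuration, code, and verification" chain: the first script loads the library, the second initializes the data layer and points it at the class property.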
Install the tag and check real-time reporting
After placing the tag, open the site in a new browser tab and confirm data in GA4’s Realtime report. Students should see their own visit appear if the installation is correct. This immediate feedback is critical in a classroom lab because it reduces uncertainty and helps learners understand the relationship between code and reporting. If no data appears, the problem is usually a typo, a blocked script, or a timing issue with the tag container.
Common setup mistakes and how to fix them
Students often copy the wrong measurement ID, install both Tag Manager and gtag at the same time, or forget to publish the container. Another frequent issue is testing in a browser with privacy extensions that block analytics scripts. Teach students to change one variable at a time and retest. For teams that care about implementation discipline, this is similar to maintaining reliable pipelines in a supply chain hygiene workflow: small mistakes can contaminate the whole system.
4. Step-by-step: set up conversion events in GA4
Choose one clear conversion goal
Pick a single student-friendly goal such as “click the signup button,” “submit the contact form,” or “download the worksheet.” The best lab goals are easy to trigger, easy to observe, and easy to explain. That way, learners can focus on the logic of conversion tracking rather than getting lost in complex ecommerce setups. In a teaching environment, clarity matters more than scale.
Create the event with either GTM or GA4 event settings
If you use Google Tag Manager, create a click trigger or form submission trigger, then send the event to GA4 with a consistent name like generate_lead or cta_click. If the site has a thank-you page, you can track the page view as the conversion event. Either method is valid, but the event name should be descriptive and stable. Students should also mark the event as a conversion in GA4 so the report reflects the action as a business outcome, not just a signal.
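If the class is using the direct gtag route instead of Tag Manager, a small click handler makes the event logic visible. This is a minimal sketch, assuming gtag.js is already installed and that the signup button has `id="signup-cta"` (both the id and the `cta_text` parameter are illustrative placeholders, not required names).

```html
<script>
  // Hypothetical CTA handler: sends a cta_click event to GA4 when the
  // signup button is clicked. Assumes the gtag.js snippet is already on
  // the page and the button uses id="signup-cta" (placeholder).
  document.getElementById('signup-cta').addEventListener('click', function () {
    gtag('event', 'cta_click', {
      page_location: window.location.href,  // where the click happened
      cta_text: this.textContent.trim()     // custom parameter for the report
    });
  });
</script>
```

Whichever route students take, the event name they send here is the same name they should look for in DebugView and mark as a conversion.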
Test the event and document the result
Students must complete at least one test action and confirm the event appears in DebugView or Realtime. The report should capture what triggered the event, where it appeared, and whether any delay occurred. This is the moment when analytics becomes reproducible. A useful classroom habit is to have students write a one-sentence verification note: “I clicked the CTA on the pricing page, and the cta_click event appeared in Realtime within 30 seconds.”
5. Step-by-step: run a Hotjar heatmap and observe behavior
Configure the heatmap on the target page
In Hotjar, select the page students want to evaluate, then generate a heatmap for clicks, scroll depth, or movement depending on the course objective. Click heatmaps are best for conversion tasks because they show whether users notice the primary action. Scroll maps are valuable if the page is long and students need to see whether the call to action appears above or below the fold. To ground this lesson in practical analytics comparisons, it helps to discuss where heatmaps fit among tools like Mixpanel, Matomo, and GA4.
Use sample behavior if live traffic is limited
Many student sites will not have enough traffic to produce a rich heatmap during class, so you should create controlled test sessions. Ask classmates to click specific elements, scroll in a normal reading pattern, and complete the conversion action. This gives the class enough behavioral data to interpret. In a one-session lab, this approach is normal and educationally sound because the goal is not statistical scale; it is learning how to read patterns and form hypotheses.
What students should look for
Students should look for three things: whether attention matches the intended conversion path, whether important elements are ignored, and whether the page creates hesitation. A CTA that receives few clicks may be too low on the page, too visually weak, or surrounded by distracting content. A form may be clicked often but abandoned if it asks for too much information. Those patterns are exactly the kind of evidence that can support simple, defensible UX recommendations.

6. Turning analytics into decisions: how to write two UX recommendations
Recommendation format: evidence, issue, change
Each recommendation should follow a simple structure: what the data shows, what problem it suggests, and what change should be tested next. For example, if the heatmap shows users rarely click the main CTA, and the CTA is below a dense block of text, the recommendation might be to move the CTA higher and make it more visually distinct. This format keeps students from making vague statements like “improve the page.” It also mirrors how analysts and designers communicate in real teams.
Example recommendation 1: strengthen the primary CTA
If the click map shows low interaction on the main button, students might recommend larger button text, stronger contrast, or placement above the fold. They should explain why the change is likely to help and how success will be measured in GA4. That might mean tracking more cta_click events after the change or comparing conversion rates before and after. This is the kind of practical reasoning that connects analytics with design, similar to how teams use the psychology of spending on a better home office to justify a purchase that improves performance.
Example recommendation 2: reduce friction in the form
If the form is the conversion point but many users stop before completion, the second recommendation should address friction. Students can propose fewer fields, clearer labels, inline help text, or a more obvious privacy statement. The key is to choose a change that is small enough to test quickly but meaningful enough to affect behavior. A classroom lab should teach prioritization, not perfection.
7. Data comparison: what GA4 tells you vs what Hotjar tells you
Use both tools together, not separately
Students often treat GA4 and Hotjar as competing tools, but they solve different problems. GA4 quantifies what happened, while Hotjar shows how users interacted with the page. The combination is what makes the analysis strong. This is the same principle behind many successful measurement stacks: one tool gives structure, another adds context, and together they create a more complete picture.
Comparison table for class discussion
| Tool | Main question answered | Best use in this lab | Strength | Limitation |
|---|---|---|---|---|
| GA4 | What actions did users take? | Track pageviews and conversion events | Quantifies traffic and conversions | Does not explain user intent well |
| Hotjar Heatmap | Where did users click or scroll? | Observe attention and friction | Shows behavioral patterns visually | Needs enough interactions to be useful |
| Google Tag Manager | How are events deployed? | Send clicks and form submits to GA4 | Flexible and reusable | Can be confusing for beginners |
| Google Search Console | How do users find the site? | Context for organic traffic sources | Helpful for SEO discovery | Does not show on-site behavior |
| Matomo | How do users behave with privacy control? | Alternative analytics comparison | Privacy-focused option | Not usually the class default |
How to interpret the table
The table shows why analytics education should teach tool selection, not tool worship. Students should understand that a heatmap cannot replace a conversion event, and a conversion event cannot explain every interaction issue. Together, they create evidence for action. In higher-level classes, you can extend this idea by comparing analytics tools the same way teams compare product strategies or content formats.
8. Instructor rubric and student deliverables
Required deliverables
Each student or group should submit three items: a screenshot or proof that GA4 is receiving data, a screenshot of the heatmap or a recorded observation, and a one-page report with two UX recommendations. This keeps the assignment focused and assessable. A short deliverable is ideal for a one-session lab because it rewards analysis rather than long writing. If you want to reinforce content strategy thinking, you can also ask students to cite a related source such as tracking tools explained or website analytics tools as background reading.
Grading criteria
A simple rubric works best: 40% for correct setup, 30% for interpretation quality, 20% for the practicality of UX recommendations, and 10% for clarity and presentation. The student should not receive full credit for “installing GA4” unless they verify the data and explain what it means. Likewise, a recommendation should not score highly if it is generic or unsupported by evidence. This pushes learners toward professional-grade thinking.
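The weighting above is simple enough to compute by hand, but a tiny scoring function can make the rubric concrete for students (and reusable across sections). This is an illustrative sketch; the component names are placeholders matching the four criteria, each graded 0 to 100.

```javascript
// Minimal sketch of the rubric: a weighted score out of 100.
// Each component is graded 0-100, then weighted 40/30/20/10.
function rubricScore(marks) {
  const { setup, interpretation, recommendations, clarity } = marks;
  // Integer weights summed first avoid floating-point drift in the grade.
  return (40 * setup + 30 * interpretation + 20 * recommendations + 10 * clarity) / 100;
}

// Example: strong setup and analysis, weaker recommendations.
console.log(rubricScore({ setup: 90, interpretation: 80, recommendations: 70, clarity: 100 })); // 84
```

The example also makes the grading philosophy visible: a student who nails setup and interpretation but writes generic recommendations still loses meaningful points.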
Example of a strong student insight
A strong report might say that the CTA button is visible but gets few clicks because users’ attention clusters on an image gallery above it. The recommendation could be to move the CTA closer to the top and reduce competing visual elements. If the form receives partial starts but few completions, the student might propose a shorter form and a clearer reassurance line. These are exactly the kinds of concise, actionable UX recommendations that employers value because they can be tested immediately.
9. Troubleshooting, ethics, and classroom best practices
Privacy, consent, and safe testing
Analytics teaching should always respect privacy. If your class site is public, make sure any recording features are used in compliance with your institution’s policies and local regulations. If the demo site is internal, that is often the easiest option because you can keep the lab contained. The point is to teach measurement responsibly, not to normalize hidden data collection. This aligns with broader concerns about trustworthy digital systems, including the importance of privacy-first personalization.
How to handle missing or messy data
Sometimes the event fires late, duplicates, or does not appear at all. Teach students to inspect one layer at a time: tag placement, trigger condition, page state, browser restrictions, and report delay. In the classroom, messy data is a feature, not a bug, because it teaches diagnosis. It is also a reminder that digital measurement systems are only as reliable as their implementation and maintenance.
How to extend the lab after class
If you want a follow-up assignment, ask students to redesign the page based on their recommendations and rerun the heatmap or conversion test. This creates a natural before-and-after workflow. Learners can compare the new pattern with the original and explain whether the changes helped. That extension mirrors how teams iterate in real-world conversion optimization programs and gives the lab a complete action loop.
10. Sample one-page report template
Suggested report structure
Ask students to keep the report short and specific. A solid structure is: objective, setup, key findings, recommendation 1, recommendation 2, and next test. That structure is easy to grade and mirrors professional analytics memos. It also prevents students from writing a narrative without a decision. In short, the report should answer: what did we measure, what did we learn, and what should change next?
Example wording students can adapt
“Our goal was to measure signup interest on the course landing page. GA4 confirmed that the page received visits and the signup CTA generated tracked clicks. Hotjar showed that users spent more attention on the hero image than the signup button. We recommend moving the CTA higher on the page and reducing the number of fields in the signup form. The next test should compare conversion rate before and after these changes.” This is a compact, evidence-based model of analytics writing.
Why this assignment works for students
This lab succeeds because it gives learners a complete cycle in one sitting: configure, verify, observe, interpret, and recommend. That cycle is the core of practical analytics education. It also bridges SEO and UX, showing that search traffic only matters when the page can convert that traffic into a meaningful action. For that reason, this lab is especially useful in courses that want to connect content performance with on-page behavior.
Frequently Asked Questions
Do students need prior coding experience for this GA4 lab?
No. Students can complete the basic workflow with copy-paste installation, Google Tag Manager templates, or a prebuilt demo site. The goal is to understand measurement logic, not to become developers in one session. If students are more advanced, you can optionally include custom event naming or dataLayer examples.
What conversion should we track in a classroom lab?
Choose a simple, visible action such as clicking a signup button, submitting a form, downloading a worksheet, or reaching a thank-you page. The best conversion is one that can be triggered quickly and verified in real time. Avoid complex ecommerce or multi-step funnels unless the class already has experience.
Can GA4 and Hotjar be used together?
Yes, and that is the ideal teaching setup. GA4 gives you the numbers, while Hotjar gives you the behavior patterns behind the numbers. Together they help students explain not just what happened, but why it happened.
How much traffic do we need for a useful heatmap?
For live sites, more traffic is better, but a classroom lab can still work with controlled test sessions. Ask students to simulate realistic behavior so the class can learn how to interpret patterns. This is enough for training and report writing, even if it is not enough for statistical certainty.
What should the final UX recommendations look like?
They should be specific, testable, and tied to evidence. Good recommendations might include moving a CTA higher, changing button contrast, shortening a form, or reducing competing content. Avoid vague advice like “make the site better” or “improve user experience.”
How does this lab support SEO learning?
SEO brings users to a page, but analytics shows whether the page earns the result. By pairing conversion tracking with heatmaps, students learn that search visibility and conversion performance are connected. That is why this lab fits naturally into an Analytics & SEO pillar.
Final takeaway
This one-session lab gives students a complete introduction to modern measurement without overwhelming them. They install GA4, create conversion events, inspect a Hotjar heatmap, and translate the data into two realistic UX recommendations. That is a powerful classroom experience because it shows how analytics supports decision-making, not just reporting. It also gives learners a repeatable method they can use in future projects, internships, and jobs.
If you want to build on this activity later, expand it into a comparison exercise with website analytics tools, a search visibility lesson using Google Search Console, or a redesign project based on the same evidence. That progression helps students see analytics as an ongoing cycle: measure, learn, improve, and measure again. For more context on how visual structure can influence attention and click behavior, revisit visual comparison pages that convert and compare the design principles to the behavior patterns in your lab.
Related Reading
- Designing an AI‑Native Telemetry Foundation: Real‑Time Enrichment, Alerts, and Model Lifecycles - Useful for understanding structured measurement systems beyond basic analytics.
- Essential Tools for Maintaining Your Home Office Setup - A practical model for organizing a classroom workflow before the lab begins.
- How to Turn a Five-Question Interview Into a Repeatable Live Series - Helpful for turning findings into a repeatable reporting format.
- Designing Privacy‑First Personalization for Subscribers Using Public Data Exchanges - Good background on responsible data practices.
- Architecting Agentic AI for Enterprise Workflows: Patterns, APIs, and Data Contracts - A broader look at how reliable systems depend on good data definitions.
Daniel Mercer
Senior SEO Content Strategist
Senior editor and content strategist. Writing about technology, design, and the future of digital media. Follow along for deep dives into the industry's moving parts.