
BESALIR AR

Developed an AR-based wildlife learning app for elementary education, from UI design and 3D integration to a RESTful API backend.

Augmented Reality

Role
Developer & Researcher
Timeline
Nov 2022 — Dec 2023
Type
Academic
Team
Independent

Wild animal education in Indonesian elementary schools relies on static textbooks and posters that lack interactivity. Students struggle to engage with flat images of animals they have never encountered in person. No existing tool combined AR visualization, species information, audio playback, and knowledge assessment in a single mobile experience designed for Indonesian elementary students.

As a solo developer, I was responsible for the end-to-end process, including UI/UX design, custom icon creation, 3D asset integration, and full-stack development. I designed the interface in Figma with a child-friendly visual language: a warm safari color palette, illustrated wooden UI elements, and large touch targets for small hands. The app architecture was kept to 5 core screens (Home, Scan, Quiz, Result, About) to minimize cognitive load for young users. Before moving to high-fidelity designs, I created low-fidelity wireframes to map out the placement of the navigation elements, ensuring that the primary actions were always accessible without deep menus.

During the transition from wireframes to the final high-fidelity design, I refined shapes and layouts through iterative testing to improve usability for children. For the Home screen, navigation labels use simple Indonesian words ("Mulai" / Start, "Kuis" / Quiz, "Tentang" / About, "Keluar" / Exit) so that elementary students can operate the app independently, without adult assistance. To build enthusiasm from the start, an engaging background music track loops automatically on the main menu. The About screen cleanly presents developer and application information without overwhelming the main interface.

BESALIR home screen with safari-themed menu showing Mulai, Kuis, Tentang, and Keluar buttons
The final Home screen features large, illustrated buttons with clear, simple Indonesian labels designed for children.
BESALIR about screen showing application info and developer details
The About screen maintains the safari theme while presenting clear information about the app's version, developer, and purpose.

To ensure the app was educational and not just a novelty, I integrated a quiz module. The quiz fetches dynamic questions from the backend, testing the student's knowledge after they interact with the AR models. To make learning engaging and reduce anxiety around testing, I gamified the results. The quiz results screen intentionally avoids raw numeric scores (like 80/100). Instead, it uses a categorical scoring system.

Quiz interface showing a multiple-choice question about animal diet with illustrated buttons
The Quiz interface is designed to be highly readable with distinct multiple-choice options.

Based on their performance, students are awarded contextual titles ranging from "Pemula Satwa Liar" (Wild Animal Beginner) to "Ahli Satwa Liar" (Wild Animal Expert). This categorical approach makes feedback meaningful, encouraging children to replay the quiz to "level up" their title, rather than feeling discouraged by a low percentage score.
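The mapping from raw score to categorical title can be sketched as a small pure function. This is a minimal illustration: only "Pemula Satwa Liar" and "Ahli Satwa Liar" are named above, so the tier thresholds and the middle title here are assumptions.

```javascript
// Maps a raw quiz score to a categorical title instead of exposing the number.
// Thresholds and the middle tier are illustrative assumptions; only
// "Pemula Satwa Liar" and "Ahli Satwa Liar" appear in the write-up.
function titleForScore(correct, total) {
  const ratio = correct / total;
  if (ratio >= 0.8) return "Ahli Satwa Liar";       // Wild Animal Expert
  if (ratio >= 0.5) return "Penjelajah Satwa Liar"; // hypothetical middle tier
  return "Pemula Satwa Liar";                       // Wild Animal Beginner
}
```

Keeping the thresholds on the server side (rather than in the app) means the difficulty of earning each title can be tuned without shipping an update.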

Quiz result screen awarding the 'Ahli Satwa Liar' title with a star graphic
Quiz results award categorical titles like "Ahli Satwa Liar" instead of numeric scores, turning assessment into a game.

The core AR experience was built in Unity using the Vuforia Engine SDK for marker-based image tracking. The application logic manages state transitions between scanning, displaying information, playing audio, and taking quizzes. The backend infrastructure supports this by serving dynamic content.
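The state transitions described above can be sketched as a small table of allowed moves. The actual app manages this in Unity C# via scenes and scripts; the state names and transitions below are illustrative assumptions, written in plain JavaScript.

```javascript
// Minimal sketch of the app's state transitions (illustrative names only;
// the real Unity implementation manages these through scenes and scripts).
const transitions = {
  scanning:     ["showingInfo", "scanning"],          // marker found -> info; lost -> keep scanning
  showingInfo:  ["playingAudio", "quiz", "scanning"],
  playingAudio: ["showingInfo", "scanning"],
  quiz:         ["result"],
  result:       ["scanning", "quiz"],                 // replay the quiz or return to scanning
};

function canTransition(from, to) {
  return (transitions[from] || []).includes(to);
}
```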

Application navigation structure diagram showing Splash Screen, Home Scene, and branching flows to Play, Quiz, About, and Exit screens
The navigation architecture keeps movement between the Home, Scan (Play), Quiz, and About scenes predictable for young users.

Quiz data, animal information, and scoring categories are served via a RESTful API built with ExpressJS and Node.js, backed by MySQL and Nginx. This architecture allows content managers to update the educational material dynamically via a web dashboard, ensuring the app remains relevant without requiring users to download a new APK.
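A quiz-questions endpoint along these lines could look like the sketch below. The real backend uses ExpressJS with MySQL; here the route path, field names, and in-memory data are illustrative assumptions, and the handler is written framework-agnostically so it can be exercised without a running server.

```javascript
// Sketch of a quiz-questions endpoint. The actual backend serves this from
// MySQL via ExpressJS; the data and field names below are illustrative.
const questions = [
  { id: 1, text: "Apa makanan utama harimau?", // "What is a tiger's main food?"
    choices: ["Daging", "Rumput", "Buah"], answer: 0 },
];

function getQuestionsHandler(req, res) {
  // In Express this would be mounted as: app.get('/api/questions', getQuestionsHandler)
  res.status(200).json({ data: questions });
}
```

Because the app fetches this payload at runtime, content managers can edit questions in the dashboard and students see the changes on the next quiz attempt.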

System flowchart detailing the logic of the AR marker detection, API requests, and data rendering
The system flowchart illustrates the sequence of operations from camera initialization to marker detection and API data retrieval.

In AR applications designed for children, unclear system states lead to immediate frustration. The app needed a robust way to communicate when the camera failed to find a marker, handle the simultaneous detection of multiple objects, and present detailed educational text without cluttering the camera view.

I implemented a visual feedback loop tied directly to Vuforia's tracking state. If a target is lost or not in the frame, the UI prompts the user with a distinct warning badge.
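The badge logic amounts to mapping the tracking state to a UI state. In the app this lives in a Unity C# callback on Vuforia's target status; the sketch below uses plain JavaScript with simplified status strings as stand-ins.

```javascript
// Maps a tracking status to the warning-badge state. In the app this is
// driven by Vuforia's target status callbacks in Unity C#; the status
// strings here are simplified stand-ins for illustration.
function badgeForStatus(status) {
  switch (status) {
    case "TRACKED":
    case "EXTENDED_TRACKED":
      return { visible: false, text: "" };                   // marker found: hide the warning
    default:
      return { visible: true, text: "Marker is missing" };   // lost or never detected
  }
}
```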

AR scan screen showing a red 'Marker is missing' badge when no target is detected
When the camera fails to detect a valid marker, a red badge immediately alerts the user, preventing confusion about the app's state.

Once valid markers are detected, the app renders the corresponding 3D models and automatically plays the specific wild animal's sound to create an immersive experience (this audio can be muted by tapping the speaker button). The tracking is robust enough to handle different physical media types simultaneously. During testing, the system successfully tracked and rendered two distinct 3D objects at once: one target displayed on a smartphone screen, and another printed on paper.

AR scanning in action, showing two 3D models rendered simultaneously over a phone screen and a printed paper marker
The tracking engine successfully renders multiple 3D models concurrently, even across different physical media (digital screen and paper).

For the educational content, I designed a progressive disclosure pattern for the animal information panel. When a user taps the 'i' (info) button, a small panel appears with a brief summary of the animal. If the child wants to read more, tapping the green 'SELENGKAPNYA' (Read More) button expands the panel, allowing them to scroll through a detailed textual description. This UX choice ensures the AR view isn't obstructed unless the child explicitly requests more information.
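The panel's progressive disclosure reduces to two booleans and three actions. The field and action names below are illustrative assumptions; the app implements this in Unity UI rather than JavaScript.

```javascript
// Sketch of the info panel's progressive-disclosure state (illustrative
// names). Tapping 'i' opens the brief summary; tapping 'SELENGKAPNYA'
// expands it to the full scrollable description.
function infoPanelReducer(state, action) {
  switch (action) {
    case "TAP_INFO": return { open: true, expanded: false };  // brief summary
    case "TAP_MORE": return { ...state, expanded: true };     // full detail text
    case "CLOSE":    return { open: false, expanded: false };
    default:         return state;
  }
}
```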

Animal information popup displaying species details with a summary and a green 'SELENGKAPNYA' button
The info panel uses progressive disclosure: showing a concise summary first, with the option to scroll through full details via the 'SELENGKAPNYA' button.

Test Cases Passed
37
Screens Built
5
Platforms Delivered
2
Papers Published
1

All 37 Black Box test cases passed after resolving 12 defects during iterative testing across the Unity mobile app and the web-based content management dashboard. The research findings were published in an international journal, serving as both an academic contribution and a practical template for developing AR-based educational applications.

Following the Multimedia Development Life Cycle (MDLC) methodology taught me the value of structured multimedia development. The six-stage process (Concept, Design, Material Collecting, Assembly, Testing, Distribution) prevented scope creep on a solo project that spanned UI design, 3D integration, API development, and academic writing simultaneously.

For future improvements, I want to add animations to the currently static 3D models to make them more dynamic. Additionally, while the native mobile app remains the primary focus, I want to explore WebAR (using libraries like AR.js or 8th Wall) as a supplementary option that lets users try the AR experience without the installation barrier. I would also design a formal user study with pre- and post-test assessments to quantify learning outcomes.