New Gigs app uses AI to organise your live music memories


A new iPhone app called Gigs is aiming to change how music fans remember live events by turning scattered concert memories into a structured, searchable archive. Developed by indie creator Hidde van der Ploeg, the app uses artificial intelligence to organise past concert experiences into a personalised digital timeline.

The idea is simple: instead of letting ticket stubs, screenshots, and photos sit forgotten across devices, Gigs brings them together into one place – complete with details, stats, and memories tied to each event.

Turning Memories Into Data And Experiences

Gigs allows users to import information from multiple sources, including tickets, emails, screenshots, or even links to event pages. The app then uses on-device AI to extract key details such as dates, venues, and artist lineups, automatically building a structured record of each concert.

🎶 Our new app Gigs: Concert Tracking is now available on the App Store!
Your new personal concert diary, beautifully designed and intelligently powered.

Please help us spread the word! Tag that friend of yours you love going to concerts or festivals with.

Link and a 30% launch… pic.twitter.com/73hSathNxv

— Hidde van der Ploeg (@hiddevdploeg) April 16, 2026

Users who already track their concerts on platforms like Setlist.fm or Concert Archives can also import their history directly, making it easier to consolidate years of live music experiences.

Once events are added, the app offers extra features: concert dates can be synced to calendars, reminders set for ticket sales, and expected setlists browsed in advance. After attending a show, users are prompted to rate the experience and upload photos or videos, gradually building a richer archive over time.
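Calendar syncing of this kind typically comes down to exporting events in the standard iCalendar format (on iOS, apps more likely write to the calendar directly via EventKit; Gigs' actual mechanism isn't documented). A minimal sketch of the data involved, with illustrative values:

```python
from datetime import datetime

def concert_to_ics(artist: str, venue: str, start: datetime) -> str:
    """Render a concert as a minimal iCalendar (RFC 5545) VEVENT string
    that most calendar apps can import. Values here are illustrative."""
    stamp = start.strftime("%Y%m%dT%H%M%S")
    return "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//example//concert-sketch//EN",
        "BEGIN:VEVENT",
        f"UID:{stamp}-{artist.replace(' ', '-')}@example.invalid",
        f"DTSTAMP:{stamp}",
        f"DTSTART:{stamp}",
        f"SUMMARY:{artist} at {venue}",
        f"LOCATION:{venue}",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

ics = concert_to_ics("Radiohead", "Wembley Arena", datetime(2026, 6, 12, 19, 0))
print(ics)
```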

Why This Matters For Music Fans

Live music is often one of the most memorable experiences for fans, but the way those memories are stored is fragmented. Photos, videos, and ticket confirmations are typically scattered across apps and devices, making it difficult to revisit them meaningfully.

Gigs addresses this by centralising those moments into a single platform, effectively turning personal concert history into something closer to a digital scrapbook or timeline. The use of AI further reduces the effort required, automatically organising data instead of relying on manual input.

This also reflects a broader trend of apps using AI to transform unstructured personal data into more usable and meaningful formats.

What It Means For Users

For users, Gigs offers a more organised and interactive way to relive past concerts. Instead of scrolling through camera rolls or email inboxes, they can access a curated history of their live music experiences in one place.

The app also adds a forward-looking element. By integrating features like ticket alerts and setlist previews, it becomes not just a memory tool but also a discovery and planning platform for future events.

What Comes Next

Currently available on iOS, Gigs is launching at a time when AI-powered personal apps are gaining traction. As the app evolves, it could expand its features to include deeper integrations with music streaming services, social sharing tools, or even community-driven insights.

If successful, Gigs could redefine how fans document and interact with live music – turning fleeting experiences into lasting, structured memories powered by AI.






Researchers at the University of Washington have developed a new prototype system that could change how people interact with artificial intelligence in daily life. Called VueBuds, the system integrates tiny cameras into standard wireless earbuds, allowing users to ask an AI model questions about the world around them in near real time.

The concept is simple but powerful. A user can look at an object, such as a food package in a foreign language, and ask the AI to translate it. Within about a second, the system responds with an answer through the earbuds, creating a seamless, hands-free interaction.

A Different Approach To AI Wearables

Unlike smart glasses, which have struggled with adoption due to privacy concerns and design limitations, VueBuds takes a more subtle approach. The system uses low-resolution, black-and-white cameras embedded in earbuds to capture still images rather than continuous video.

These images are transmitted via Bluetooth to a connected device, where a small AI model processes them locally. This on-device processing ensures that data does not need to be sent to the cloud, addressing one of the biggest concerns around wearable cameras.

To further enhance privacy, the earbuds include a visible indicator light when recording and allow users to delete captured images instantly.

Engineering Around Power And Performance Limits

One of the biggest challenges the research team faced was power consumption. Cameras require significantly more energy than microphones, making it impractical to use high-resolution sensors like those found in smart glasses.

To solve this, the team used a camera roughly the size of a grain of rice, capturing low-resolution grayscale images. This approach reduces battery usage and allows efficient Bluetooth transmission without compromising responsiveness.

Placement was another key consideration. By angling the cameras slightly outward, the system achieves a field of view between 98 and 108 degrees. While there is a small blind spot for objects held extremely close, researchers found this does not affect typical usage.
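The trade-off between outward tilt, combined coverage, and the close-range blind spot follows from simple trigonometry. The researchers' exact parameters aren't given beyond the combined field of view, so the numbers below (15 cm ear separation, 90-degree cameras tilted 10 degrees outward) are assumptions chosen to land near the reported 98-108 degree range:

```python
import math

def blind_spot_distance(ear_separation_m: float,
                        camera_fov_deg: float,
                        outward_tilt_deg: float) -> float:
    """Nearest distance (m) at which an object centred in front of the
    wearer falls inside BOTH cameras' fields of view.

    Each camera sits at +/- half the ear separation and is rotated
    outward by the tilt angle; a centred point at distance z is offset
    from a camera's optical axis by atan(s / z) plus the tilt, which
    must stay within half the camera's field of view.
    """
    s = ear_separation_m / 2
    margin = math.radians(camera_fov_deg / 2) - math.radians(outward_tilt_deg)
    if margin <= 0:
        return math.inf  # tilted past the FOV edge: no shared view ahead
    return s / math.tan(margin)

# Assumed numbers: combined coverage is roughly 45 + 10 = 55 degrees per
# side (~110 total), with a blind spot closer than about 11 cm.
print(round(blind_spot_distance(0.15, 90, 10), 3))
```

Under these assumptions the blind spot ends around 10-11 cm from the face, which is consistent with the researchers' observation that only objects held extremely close are affected.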

The system also combines images from both earbuds into a single frame, improving processing speed. This allows VueBuds to respond in about one second, compared to two seconds when handling images separately.
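The combination step can be pictured as placing the two grayscale frames side by side so that a single model invocation covers both views instead of two sequential ones. A toy sketch of that operation (the paper's actual pipeline is not public; frames here are row-major lists of pixel rows):

```python
def stitch_frames(left, right):
    """Combine two grayscale frames (row-major lists of pixel rows) into
    one side-by-side frame, halving the number of model calls needed."""
    if len(left) != len(right):
        raise ValueError("frames must have the same height")
    # Concatenate each pair of rows: left pixels first, then right.
    return [l_row + r_row for l_row, r_row in zip(left, right)]

left = [[0, 1], [2, 3]]    # 2x2 toy "image" from the left earbud
right = [[4, 5], [6, 7]]   # 2x2 toy "image" from the right earbud
combined = stitch_frames(left, right)
print(combined)  # [[0, 1, 4, 5], [2, 3, 6, 7]]
```

One wider inference is generally cheaper than two separate ones because the model's fixed per-call overhead is paid once, which is consistent with the roughly one-second versus two-second response times reported.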

Performance Compared To Smart Glasses

In testing, 74 participants compared VueBuds with smart glasses such as Meta’s Ray-Ban models. Despite using lower-resolution images and local processing, VueBuds performed similarly overall.

The results showed that participants preferred VueBuds for translation tasks, while smart glasses performed better at counting objects. In separate trials, VueBuds achieved accuracy rates of around 83–84% for translation and object identification, and up to 93% for identifying book titles and authors.

Why This Matters And What Comes Next

The research highlights a potential shift in how AI-powered wearables are designed. By embedding visual intelligence into a device people already use, the system avoids many of the barriers faced by smart glasses.

However, limitations remain. The current system cannot interpret color, and its capabilities are still in early stages. The team plans to explore adding color sensors and developing specialised AI models for tasks like translation and accessibility support.

The researchers will present their findings at the Association for Computing Machinery Conference on Human Factors in Computing Systems in Barcelona, offering a glimpse into a future where everyday devices quietly become intelligent assistants.


