Shazam-like iOS app built with SwiftUI, ShazamKit and SwiftData for the 2nd Apple Developer Academy challenge.


🎧 ShazamKit Challenge – SwiftUI + SwiftData Shazam-like App

Rebuilding the core user experience of Shazam using modern Apple frameworks:

Tap a big button → the app listens through the microphone → identifies the song → shows rich track details and keeps a history.

This project started as a personal learning challenge around ShazamKit and evolved into a small but complete SwiftUI app with persistent history, navigation, and a polished UI.


🚀 Features

  • 🎵 One-tap song recognition

    • Uses ShazamKit and SHManagedSession to capture audio from the microphone.
    • Matches against the global Shazam catalog and returns full metadata (SHMatchedMediaItem).
  • 🌀 Recognition screen with animated UI

    • Central “Shazam-style” button with listening / idle states.
    • Visual feedback while the app is capturing audio.
    • Error and “no match” handling.
  • 📱 Result view

    • Displays artwork, title, artist and additional metadata when available.
    • Clean layout focused on the currently recognized track.
  • 📚 Library of recognized songs

    • Every successful match is saved using SwiftData (SongModel).
    • LibraryView shows a list of previously recognized songs, most recent first.
    • Artwork thumbnails, title and artist for quick scanning.
    • Ready for actions like tap-to-open-details or deletion.
  • 🧭 Modern navigation

    • NavigationStack-based flow.
    • Dedicated views:
      • HomeView
      • RecognitionView
      • ResultView
      • LibraryView
  • 🧩 Clean, focused architecture

    • AudioRecognition view model encapsulating ShazamKit logic.
    • SwiftUI views remain declarative and focused on rendering state.
    • Uses modern Swift features (async/await, ObservableObject, @StateObject).

💡 MusicKit integration (Apple Music playback) is planned / in progress and can be layered on top of the existing model (SongModel).
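The one-tap recognition flow above can be sketched as a small view model around `SHManagedSession` (a minimal sketch under assumptions: the repository's `AudioRecognition` type and its property names may differ; only `SHManagedSession`, `SHMatchedMediaItem`, and the `ObservableObject`/async-await usage come from the README itself):

```swift
import ShazamKit
import SwiftUI

// Sketch of a ShazamKit-backed view model. SHManagedSession handles
// microphone capture and matching against the Shazam catalog internally.
@MainActor
final class AudioRecognition: ObservableObject {
    @Published var isListening = false
    @Published var matchedItem: SHMatchedMediaItem?
    @Published var errorMessage: String?

    private let session = SHManagedSession()

    func recognize() async {
        isListening = true
        defer { isListening = false }

        // result() suspends while audio is captured, then returns
        // .match, .noMatch, or .error.
        switch await session.result() {
        case .match(let match):
            matchedItem = match.mediaItems.first
        case .noMatch:
            errorMessage = "No match found"
        case .error(let error, _):
            errorMessage = error.localizedDescription
        }
    }
}
```

Note that `SHManagedSession` requires an `NSMicrophoneUsageDescription` entry in Info.plist, and recognition is best tested on a physical device.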


🧱 Tech Stack

  • Language: Swift
  • UI: SwiftUI
  • Data Persistence: SwiftData (SongModel)
  • Audio Recognition: ShazamKit (SHManagedSession)
  • Architecture: MVVM-style (view model + SwiftUI views)
  • Platform: iOS
  • Minimum iOS version: iOS 17+ (required by SwiftData)
  • Tools: Xcode 16+ (recommended), physical device for testing
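For orientation, the persistence layer's `SongModel` might look roughly like this (a hypothetical sketch: the property names here are assumptions, not taken from the repository; only the `@Model`/SwiftData usage and the `SongModel` name come from the README):

```swift
import Foundation
import SwiftData

// Hypothetical shape of the SwiftData model for a recognized song.
@Model
final class SongModel {
    var title: String
    var artist: String
    var artworkURL: URL?     // remote artwork, if ShazamKit provided one
    var recognizedAt: Date   // used to sort the library, most recent first

    init(title: String,
         artist: String,
         artworkURL: URL? = nil,
         recognizedAt: Date = .now) {
        self.title = title
        self.artist = artist
        self.artworkURL = artworkURL
        self.recognizedAt = recognizedAt
    }
}
```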

📂 Project Structure (High-Level)

.
├─ CH2App.swift            // App entry point, app lifecycle
├─ AppDelegate.swift       // UNUserNotificationCenter delegate, app setup
│
├─ Models/
│  └─ SongModel.swift      // SwiftData model for recognized songs
│
├─ ViewModels/
│  └─ AudioRecognition.swift  // ShazamKit integration, recognition state
│
├─ Views/
│  ├─ HomeView.swift       // Root view, navigation/tab structure
│  ├─ RecognitionView.swift  // Main recognition screen
│  ├─ ResultView.swift     // Detailed result of the last match
│  └─ LibraryView.swift    // Persistent library of recognized tracks
│
└─ Resources/
   └─ Assets, Info.plist, etc.
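The library screen described above maps naturally onto SwiftData's `@Query`; a sketch (assuming a `SongModel` with `title`, `artist`, and `recognizedAt` properties, which are illustrative names, not confirmed by the repository):

```swift
import SwiftData
import SwiftUI

// Sketch of the library list: most recent matches first, with swipe-to-delete.
struct LibraryView: View {
    @Query(sort: \SongModel.recognizedAt, order: .reverse)
    private var songs: [SongModel]

    @Environment(\.modelContext) private var context

    var body: some View {
        List {
            ForEach(songs) { song in
                VStack(alignment: .leading) {
                    Text(song.title).font(.headline)
                    Text(song.artist).font(.subheadline).foregroundStyle(.secondary)
                }
            }
            .onDelete { offsets in
                // Deleting from the model context also removes the row,
                // since @Query keeps the list in sync.
                for index in offsets { context.delete(songs[index]) }
            }
        }
        .navigationTitle("Library")
    }
}
```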
