origo

MIT license

overview

we developed origo as part of a challenge focused on solving problems for specific professions. the idea was to replace the clipboards and paper used at archaeological sites with a robust digital solution. the app allows for detailed artifact registration, including technical data like materials, production techniques, and conservation status, as well as site information and process numbers.
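the kind of registration data described above can be sketched as a simple value type. this is an illustrative sketch only; the field names and the `ConservationStatus` cases below are our assumptions, not origo's actual model:

```swift
import Foundation

// hypothetical conservation states an artifact might be registered with
enum ConservationStatus: String {
    case intact, fragmented, eroded, restored
}

// illustrative sketch of the kind of data a registration form captures
struct ArtifactRegistration {
    let processNumber: String
    let material: String
    let productionTechnique: String
    let conservationStatus: ConservationStatus
    let siteName: String
}

let sample = ArtifactRegistration(
    processNumber: "2024-001",
    material: "ceramic",
    productionTechnique: "hand-modeled",
    conservationStatus: .fragmented,
    siteName: "example site"
)
print(sample.conservationStatus.rawValue) // prints "fragmented"
```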

more than just a simple form, origo acts as a multimedia field companion, allowing users to attach photos and audio recordings to each record, consolidating all excavation information into a single device.

from a technical perspective, origo was a deep dive into SwiftData for local persistence, handling complex models that connect artifacts to varied media. the integration with the Google Maps SDK was essential for capturing precise coordinates, which we converted to the UTM format (standard in archaeology).
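the app delegates the full conversion to a formatter (shown in the snippets below), but the core of UTM addressing — the zone number and latitude band — is simple arithmetic. a minimal sketch, with function names that are ours rather than the app's:

```swift
import Foundation

// UTM zone number: the globe is split into 60 longitudinal zones of 6° each
func utmZone(forLongitude longitude: Double) -> Int {
    return Int(floor((longitude + 180.0) / 6.0)) + 1
}

// UTM latitude band: 8° bands from 80°S to 84°N, lettered C–X (I and O skipped)
func utmBand(forLatitude latitude: Double) -> String? {
    let bands = Array("CDEFGHJKLMNPQRSTUVWX")
    guard latitude >= -80, latitude <= 84 else { return nil }
    let index = min(Int(floor((latitude + 80.0) / 8.0)), bands.count - 1)
    return String(bands[index])
}

// São Paulo (-23.55, -46.63) falls in UTM zone 23K
print(utmZone(forLongitude: -46.63))   // prints 23
print(utmBand(forLatitude: -23.55)!)   // prints K
```

this covers only the grid designation; the easting/northing projection math is what the formatter handles in the app.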

another highlight was the implementation of the Speech API, which automatically transcribes user-recorded audio notes into text, saving valuable typing time. the interface, built 100% in SwiftUI, manages dense data input flows without sacrificing clarity and usability.

design choices

palette

bedrock #472500
rust #C54302
dust #F6D2B2
river #436187
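in code, a palette like this is typically centralized behind a hex parser so the design tokens live in one place. a minimal sketch using the hex values above (the helper name is ours):

```swift
import Foundation

// parse a "#RRGGBB" string into 0–255 RGB components
func rgbComponents(fromHex hex: String) -> (r: Int, g: Int, b: Int)? {
    let cleaned = hex.hasPrefix("#") ? String(hex.dropFirst()) : hex
    guard cleaned.count == 6, let value = Int(cleaned, radix: 16) else { return nil }
    return (r: (value >> 16) & 0xFF, g: (value >> 8) & 0xFF, b: value & 0xFF)
}

// the palette above, as named constants
let palette: [String: String] = [
    "bedrock": "#472500",
    "rust": "#C54302",
    "dust": "#F6D2B2",
    "river": "#436187",
]

if let rust = rgbComponents(fromHex: palette["rust"]!) {
    print(rust) // (r: 197, g: 67, b: 2)
}
```

in SwiftUI these components would feed `Color(red:green:blue:)` with each value divided by 255.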

typography

Brasilero 2018 Free / onboarding and logo
The quick brown fox jumps over the lazy dog
SF Pro Display / primary interface
The quick brown fox jumps over the lazy dog

some fonts used in this project are proprietary and may not display correctly if they are not installed on your system.

rationale

origo is an app designed to assist archaeologists with the work of recording and cataloging artifacts found in the field.

considering the app's purpose, the color palette was inspired by earthy and natural tones, recalling the soil and artifacts found during excavations.

the chosen colors convey a sense of connection with nature and the historical past.

the typography draws inspiration from brazilian popular culture for the onboarding and the app name, using the Brasilêro font, which was created from the analysis of thousands of hand-painted signs and placards in urban and rural communities throughout brazil.

we chose this font because it evokes national culture and brings a human, artisanal touch to the design.

for the rest of the app, we used the SF Pro Display font, which is modern, clean, and highly legible on digital screens, ensuring a practical and native reading experience for users.

furthermore, the app features one more artisanal detail that nods to fieldwork:

we scanned various leaves and small plants, whose silhouettes were vectorized and used as ornaments and visual details in the onboarding and the app logo, reinforcing the connection with nature and the archaeological environment.

tech stack

SwiftUI
UI framework
SwiftData
local persistence
Google Maps SDK
location services
AVFoundation
audio handling

code snippets

draft state management

swift
import Combine
import Foundation

// mutable draft backing the multi-step registration flow
class RecordDraft: ObservableObject {
    @Published var id: UUID = UUID()
    @Published var name: String = ""
    @Published var createdAt: Date = Date()
    @Published var location: LocationInfoModel? = nil
    @Published var artifactData: ArtifactDataModel? = nil
    @Published var artifactDetails: ArtifactDetailsModel? = nil
    @Published var audios: [AudioModel] = []
    @Published var photos: [CapturedImageModel] = []
    @Published var geolocation: MapMarkerModel? = nil

    // converts the in-progress draft into a persistable record
    var asRecordModel: RecordModel {
        let artifact = ArtifactModel(
            location: location,
            artifactData: artifactData,
            artifactDetails: artifactDetails
        )
        return RecordModel(
            id: id,
            name: name,
            createdAt: createdAt,
            artifact: artifact,
            audios: audios,
            photos: photos,
            geolocation: geolocation
        )
    }
}

speech transcription

swift
import Combine
import Speech

class AudioTranscriber: ObservableObject {
    @Published var transcription: String = ""

    // Brazilian Portuguese recognizer, matching the app's target users
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "pt-BR"))
    private var request: SFSpeechRecognitionRequest?

    func transcribeAudio(url: URL) {
        SFSpeechRecognizer.requestAuthorization { [weak self] authStatus in
            guard authStatus == .authorized else { return }

            let request = SFSpeechURLRecognitionRequest(url: url)
            // keep recognition offline, since excavation sites rarely have connectivity
            request.requiresOnDeviceRecognition = true
            self?.request = request

            guard let recognizer = self?.recognizer, recognizer.isAvailable else { return }

            recognizer.recognitionTask(with: request) { result, error in
                DispatchQueue.main.async {
                    if let result = result, result.isFinal {
                        self?.transcription = result.bestTranscription.formattedString
                    } else if error != nil {
                        self?.transcription = ""
                    }
                }
            }
        }
    }
}

Google Maps integration

swift
import CoreLocation
import GoogleMaps

class MapsViewModel: NSObject, ObservableObject, GMSMapViewDelegate {
    @Published var cameraPositionLatitude: Double = 0
    @Published var cameraPositionLongitude: Double = 0
    @Published var utmString: String = ""
    @Published var mapView: GMSMapView!
    @Published var camera = GMSCameraPosition.camera(withTarget: .init(latitude: 0, longitude: 0), zoom: 15.0)

    private var locationManager: LocationViewModel = .init()
    private var formatter: LocationCoordinateFormatter!

    override init() {
        super.init()
        // formats CLLocationCoordinate2D values as UTM strings
        formatter = LocationCoordinateFormatter()
        formatter.format = .utm
        mapView = GMSMapView(frame: .zero, camera: camera)
        mapView.mapType = .hybrid
        mapView.isMyLocationEnabled = true
        mapView.delegate = self
    }

    // recenters the camera on the device's current location
    func updateCurrentLocation() {
        cameraPositionLatitude = locationManager.latitude
        cameraPositionLongitude = locationManager.longitude
        camera = GMSCameraPosition.camera(
            withTarget: .init(latitude: cameraPositionLatitude, longitude: cameraPositionLongitude),
            zoom: camera.zoom
        )
    }

    // reads the camera target and republishes it as a UTM string
    func updateUTMCoordinates() {
        cameraPositionLatitude = camera.target.latitude
        cameraPositionLongitude = camera.target.longitude
        utmString = formatter.string(
            from: CLLocationCoordinate2D(latitude: cameraPositionLatitude, longitude: cameraPositionLongitude)
        ) ?? ""
    }

    // delegate callback: the camera stopped moving, so refresh the coordinates
    func mapView(_ mapView: GMSMapView, idleAt position: GMSCameraPosition) {
        camera = position
        updateUTMCoordinates()
    }
}

credits