A big update to BitMaps, my #visionOS maps app, is out today! I've added a new split view for browsing locations. You can now adjust the terrain style of each individual map and set a default zoom level. And I've improved the "Open in BitMaps" share extension to support more types of addresses: https://apps.apple.com/us/app/bitmaps-bite-size-map-widgets/id6477943497
My little #RealityKit / #visionOS debugging tool is getting closer to release. Still a few things to be done, but I'm quite happy w/ it already. Completely written in #SwiftUI.
“This has come up several times on the forums, but I've never written it up in a standard place, so here it is: There are only three ways to get run-time polymorphism in Swift. Well, three and a half.”
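The linked article enumerates its own "three and a half" ways; as a rough illustration (not necessarily the article's exact taxonomy), here is a sketch of three mechanisms Swift commonly uses for run-time polymorphism: class inheritance, protocol existentials, and plain function values.

```swift
// 1. Class inheritance: dynamic dispatch through the class hierarchy.
class Shape {
    func area() -> Double { 0 }
}
class Square: Shape {
    let side: Double
    init(side: Double) { self.side = side }
    override func area() -> Double { side * side }
}

// 2. Protocol existentials: dispatch through a protocol witness.
protocol Describable {
    var label: String { get }
}
struct Point: Describable {
    var label: String { "a point" }
}

// 3. Function values: a closure is the smallest polymorphic unit.
let operations: [(Int) -> Int] = [{ $0 + 1 }, { $0 * 2 }]

let shapes: [Shape] = [Square(side: 3), Shape()]
let areas = shapes.map { $0.area() }        // the override is chosen at run time
let thing: any Describable = Point()
let results = operations.map { $0(10) }     // each element dispatches differently

print(areas, thing.label, results)
```

All three let the call site stay fixed while the behavior varies at run time; which one the article counts as the "half" is left to the article itself.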
macOS hive mind: If I were to write a small utility to view 3MF files (the new standard for additive manufacturing, which at a basic level is a triangle mesh), should I use SceneKit or RealityKit to render it?
Really, I'd be writing an importer (I can't seem to find one already in existence).
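For the importer half of that question: since a 3MF model boils down to vertex positions plus triangle indices, RealityKit can consume it fairly directly via `MeshDescriptor`. A minimal sketch (the triangle data here is a placeholder standing in for whatever the 3MF parser produces):

```swift
import RealityKit

// Hypothetical output of a 3MF parser: one triangle's worth of data.
let vertices: [SIMD3<Float>] = [
    SIMD3(0, 0, 0),
    SIMD3(1, 0, 0),
    SIMD3(0, 1, 0),
]
let indices: [UInt32] = [0, 1, 2]

// Feed the raw triangle mesh to RealityKit.
var descriptor = MeshDescriptor(name: "3mf-object")
descriptor.positions = MeshBuffer(vertices)
descriptor.primitives = .triangles(indices)

let mesh = try MeshResource.generate(from: [descriptor])
let entity = ModelEntity(mesh: mesh, materials: [SimpleMaterial()])
```

SceneKit has an equivalent path through `SCNGeometry` sources and elements, so the choice is less about mesh ingestion and more about which framework you want to live in going forward.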
When SwiftUI was first released, one of the great features that piqued my interest was the instant preview function. This feature empowers developers to
see the results of code changes in real time, without rebuilding and relaunching the app.
Spectra for Prime Video is now OPEN SOURCE. I look forward to seeing this project grow to a degree that I could not make happen on my own!
🚀 Boosts Appreciated! 😃
For those who are unaware: Spectra is an app for Apple Vision Pro that allows you to watch Amazon Prime Video content. While the iPadOS app is available on #visionos, many have said it is hard to use. Spectra aims to solve that by wrapping the website in a native app using #swift and #swiftui.
As this is the first real #opensource project I’ve made, I’ve done some research and think I have set up this repository correctly. But I am open to suggestions on what to look out for, and on which repository settings I should consider enabling or disabling to prevent unwanted activity.
Thinking about making this open source and publishing it for free on the App Store (probably with a tip jar). I just don’t know enough about JavaScript to do this properly all on my own, and I’m sure there are others who would have a good idea of how to structure the project better than I do.
Hey @MonaApp Do you all have plans to make a #VisionOS version of Mona? I'd love to test it if you do. I love the Vision Pro, and would really like to see my favorite Mastodon app there.
Today marks the release of @daypeek's biggest update yet: edit your schedule and add new calendar events with natural language input 🗓️🪄
While I plan to prioritize @zenitizer for some time to come, I do plan to eventually port Day Peek to #iOS, #watchOS and #macOS 🤓 (Please let me know if you'd be interested in beta testing that if/when it's available)
The new Apple Vision Pro device is almost here, and SwiftUI is the best way to build a visionOS app quickly and natively. This week, we will continue the topic of the new SwiftUI APIs that we can use to adapt our apps to visionOS. We will learn about the new user interface component called ornaments.
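As a quick taste of the ornaments API covered there, here is a hedged sketch: `VideoControls` and the player view are made-up placeholders, but the `.ornament` modifier and `.glassBackgroundEffect()` are the visionOS SwiftUI APIs in question (worth double-checking the exact anchor values against current docs).

```swift
import SwiftUI

// A hypothetical player view with controls floating below the window.
struct PlayerView: View {
    var body: some View {
        Color.black  // placeholder for the actual video content
            .ornament(attachmentAnchor: .scene(.bottom)) {
                VideoControls()
            }
    }
}

struct VideoControls: View {
    var body: some View {
        HStack {
            Button("Play") { }
            Button("Pause") { }
        }
        .padding()
        .glassBackgroundEffect()  // the standard visionOS glass material
    }
}
```

The ornament renders outside the window's bounds while remaining anchored to it, which is what makes it a good home for secondary controls.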
If you are wondering why #SwiftUI attachments to #RealityKit entities in #visionOS don't seem to compile anymore: looks like the API got changed in the final release 🙂
The compiler is as helpful as usual in SwiftUI contexts: it suggests that the old version of the API is not available on visionOS ...