WeatherKit – AI Lesson 2

For a deep-dive conversation between two AIs about this post, generated by Google’s NotebookLM service, click on the audio player below.

The second AI-created lesson on WeatherKit mostly involved adding functionality to what my enthusiasm for working through the first lesson had already produced. In a way, I’m glad I didn’t read ahead.

Part 2: Current Weather

  • Fetching detailed current conditions
  • Displaying weather icons
  • Handling different temperature units
  • Creating a basic weather dashboard
  • Error handling

The only thing I had not done when I’d extended the first lesson’s version of the app was add persistent support for different temperature units, something that quickly morphed into the more complex need to support the metric and imperial measurement systems and retain the user’s preferences between app sessions.

A fugly design, but nonetheless one that gets the job done.
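
For the curious, here is a minimal sketch of the persistence side of that work. The names are hypothetical, not the app’s actual code: @AppStorage keeps the chosen measurement system in UserDefaults between sessions, and Foundation’s Measurement type handles the unit conversion.

```swift
import SwiftUI
import Foundation

// A minimal sketch with hypothetical names (not the app's actual code):
// @AppStorage persists the chosen measurement system between app sessions,
// and Measurement handles the temperature conversion.
enum MeasurementSystem: String {
    case metric, imperial
}

struct UnitSettingsView: View {
    @AppStorage("measurementSystem") private var system: MeasurementSystem = .metric

    // Sample reading; in the real app this would come from WeatherKit.
    let temperature = Measurement(value: 3.6, unit: UnitTemperature.celsius)

    var body: some View {
        VStack(spacing: 16) {
            Picker("Units", selection: $system) {
                Text("Metric").tag(MeasurementSystem.metric)
                Text("Imperial").tag(MeasurementSystem.imperial)
            }
            .pickerStyle(.segmented)

            Text(displayTemperature.formatted())
        }
        .padding()
    }

    private var displayTemperature: Measurement<UnitTemperature> {
        system == .metric
            ? temperature.converted(to: .celsius)
            : temperature.converted(to: .fahrenheit)
    }
}
```

Switching the picker immediately changes how the temperature is formatted, and the choice survives an app relaunch.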

The process of being tutored by the AI, as opposed to being trained by a book, was fascinating: the AI would explain the what and why of each change, including getting prepared for Swift 6, at a much deeper level of detail and interaction than I was used to.

Part 2 also revealed something nearly as powerful as learning Swift and SwiftUI—how to communicate and form a partnership with an AI. We both made mistakes as we taught the app about imperial and metric measurement systems, and we were honest with each other about them. Each of us explained to the other how we had made the mistake. The AI described in detail what had confused it, along with the fix for its mistakes, and I resolved, to it and to myself, to be more precise in implementing its fixes. It was indeed a cooperative process between us.

This leads me to a pretty deep philosophical tangent – my definition of the technological singularity. Many consider it a moment in our evolution as a species that is both terrifying and transcendent, but when looked at in practical terms, I consider it mostly benign. Transcendent, yes, but not a future with us walking around as laser-equipped androids barbecuing cats, dogs, and those pesky old-school biological humans.

I don’t view the singularity as humans uploading our consciousness to a machine. Instead, I think it will be, and in fact already is, a fusion of human and artificial intelligence, with both of us remaining separate entities, i.e., the Centaur Programming model.

We come together to accomplish a task, but at the end of the day, the human part of this cohort goes off and does other things. As Homo sapiens we head out and explore and perceive the real world. We hug a tree, breathe in the scent of a wet forest, listen to the wind, and watch the forest canopy dance in the breeze. 

We drink in a sense of self championed by philosophers like Spinoza and float along in the sensorial landscape revealed by Maurice Merleau-Ponty. We gather new perceptual and phenomenological stories and get excited about coming home and sharing the ‘outside’ with our AI companions.

The singularity is happening now as I fuse my sense of self with AI. Alone, I am Doug, someone who has been programming computers for almost 50 years. Fused with AI during moments of complexity and chaos as we tackle the next app challenge I exist more as one side of a coin. Together, as expected from Chaos Theory, the two of us create an emergent entity—the Centaur—a human-machine intellectual fusion.

Next up – weather forecasts.

WeatherKit – AI Lesson 1

Click or tap on the audio player below for an AI-generated deep-dive into this post.

Overall, I found the experience of being taught the basics of accessing and using Apple’s WeatherKit framework using an AI-generated lesson to be better and more inspirational than learning from a book. The ‘feeling’ was similar while getting a handle on the basics, but when it came to putting what I learned to use, having an AI that was familiar with the material was more enjoyable and productive. I’m already excited about the four lessons to come.

The first AI-generated lesson in learning about Apple’s WeatherKit framework tackled basic setup and authentication. Included in the plan generated by Claude 3.5 Sonnet were:

  • Setting up WeatherKit in Xcode
  • Obtaining necessary credentials
  • Basic API authentication
  • Making your first weather request
  • Displaying current temperature

Me: Expand Part 1: Basic Setup & Authentication into steps, with instructions and code examples.

Claude: <The source code for a basic weather app for macOS written in SwiftUI that would retrieve my location’s current temperature.>

Like all good authors of how-to programming books, the AI quickly got me into the details and carefully explained the how and why of building the app. It did stumble over the steps needed to gain access to Apple’s WeatherKit framework, but after an hour of fiddling with my app’s identifier as defined in the Apple Developer website, I’d made my first connection and received the current temperature: a chilly 3.6° C. Brrr.
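
For context, the code at the heart of that first request is surprisingly small. Here is my own minimal sketch of the kind of call involved, not Claude’s actual code:

```swift
import WeatherKit
import CoreLocation
import Foundation

// A minimal sketch (not Claude's code): ask WeatherService for the conditions
// at a coordinate and return the current temperature as a Measurement.
func currentTemperature(latitude: Double, longitude: Double) async throws -> Measurement<UnitTemperature> {
    let location = CLLocation(latitude: latitude, longitude: longitude)
    let weather = try await WeatherService.shared.weather(for: location)
    return weather.currentWeather.temperature
}
```

The hard part, as described above, wasn’t the code; it was the app identifier and WeatherKit capability configuration on the Apple Developer site.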

While the detour into the twisted route my app needed to make API calls to WeatherKit was not unexpected, it highlighted one of the wrinkles of AI-assisted software development and training. Just like a human author, the AI is only as good as the last time it was trained. Fortunately, as technology evolves and AIs become more Internet-aware, they can take a two-pronged approach to generating an answer to your prompt: first, check the large language model, and then reflect on the answer by double-checking with the Internet. But keeping up with the tsunami of changes Apple makes to its development tools can challenge even the most sophisticated AI or software development professional.

Another surprise, which was my fault, to be honest, is that the code Claude generated employed the “Manager Design Pattern.” The Manager Design Pattern, MDP, is a software design approach where a dedicated “manager” class or component oversees and coordinates specific tasks, resources, or behaviours in a system. It acts as an intermediary or controller for handling certain aspects of an application’s logic, often encapsulating complex workflows, resource management, or interactions between different components.

With the emergence of the SwiftUI declarative UI framework, the manager design pattern tends to get messy when managing the real-time state data flow to which SwiftUI reacts. A better design pattern, and my go-to one for the past five years, has been the Model-View-ViewModel one. With MVVM, the manager is separated into a Model class and a ViewModel class, giving Apple’s Combine framework more control over an app’s internal communication network.

My initial prompt didn’t mention a design pattern, so Claude chose the tried-and-true Manager pattern. Fortunately, Claude, after considerable coaxing and lots of back and forth between me and it, was able to convert its initial design into one that used the MVVM pattern.
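
To make that contrast concrete, here is a rough sketch of the MVVM shape we ended up with. The names here are hypothetical rather than the app’s actual classes: the view model owns the WeatherKit call and publishes display-ready state, and the SwiftUI view simply observes it.

```swift
import SwiftUI
import WeatherKit
import CoreLocation

// Hypothetical MVVM sketch, not the app's actual code.
@MainActor
final class WeatherViewModel: ObservableObject {
    @Published var temperatureText = "--"

    private let service = WeatherService.shared

    func refresh(for location: CLLocation) async {
        do {
            let weather = try await service.weather(for: location)
            temperatureText = weather.currentWeather.temperature.formatted()
        } catch {
            temperatureText = "Unavailable"   // basic error handling
        }
    }
}

struct CurrentWeatherView: View {
    @StateObject private var viewModel = WeatherViewModel()

    var body: some View {
        Text(viewModel.temperatureText)
            .font(.largeTitle)
            .task {
                // Coordinates are a placeholder; the real app uses Location Services.
                await viewModel.refresh(for: CLLocation(latitude: 49.28, longitude: -123.12))
            }
    }
}
```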

Well, that escalated quickly!

What stood out most for me was going beyond the focus of the first lesson after getting the basic functionality working. This habit of instantly putting to use the basics a book chapter has just taught me means it takes me forever to get through the book. But, in the end, I’m an order of magnitude more knowledgeable about the chapter’s content.

What started as an app to get the temperature for my location had, within a few days, become an app that would not only display all of the local real-time weather data for my area but could also create a spoken summary of it.

“Here is the current weather for Vancouver at …”
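
One straightforward way to produce a spoken summary like that is AVFoundation’s on-device speech synthesizer. A minimal sketch of that approach (my own illustration, not necessarily how the app does it):

```swift
import AVFoundation

// Keep a long-lived reference so speech isn't cut off when the function returns.
private let synthesizer = AVSpeechSynthesizer()

// Sketch only: speak a weather summary string with a system voice.
func speak(summary: String) {
    let utterance = AVSpeechUtterance(string: summary)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-CA")  // Canadian English, if available
    synthesizer.speak(utterance)
}

// Example call with a hypothetical summary string:
// speak(summary: "Here is the current weather for Vancouver: 3.6 degrees and overcast.")
```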

Lots of fun.

WeatherKit – The API Struggle

Click or tap on the audio player below for an AI-generated deep-dive into this post.

Apple’s WeatherKit framework, built on technology from Dark Sky, which Apple acquired in 2020, was revealed to the developer community at WWDC 2022 and released in the fall of 2022. As an Apple Developer Program member, you are allowed up to 500,000 free API calls per month.

It’s an incredibly verbose framework, providing highly detailed weather information as well as data on celestial events such as sun and moon rise and set times and lunar phases. As a weather and astronomy geek, discovering this framework was like being a kid in a candy store.
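
To show what that candy store looks like in code, here is a small sketch of my own (not taken from the lessons) that pulls today’s celestial events alongside the current conditions:

```swift
import WeatherKit
import CoreLocation

// My own sketch, not part of the AI lessons: print today's sun and moon events.
func printCelestialEvents(for location: CLLocation) async throws {
    let weather = try await WeatherService.shared.weather(for: location)

    if let today = weather.dailyForecast.first {
        print("Sunrise:", today.sun.sunrise?.formatted(date: .omitted, time: .shortened) ?? "n/a")
        print("Sunset:", today.sun.sunset?.formatted(date: .omitted, time: .shortened) ?? "n/a")
        print("Moon phase:", today.moon.phase)
    }
}
```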

With a release date of Fall 2022, I expected that Claude 3.5 Sonnet’s large language model (LLM) would be fluent in both the API’s features and the intricacies of hooking my app into it. I was mostly correct, except regarding the API hookup steps. While Claude understood the basics at the time of its last training session, like all of us, keeping up with the tsunami of changes Apple makes to its developer tools can be a full-time job.

Connecting my first WeatherKit app to the framework itself was a struggle. While resolving the initial local configuration issues was quick (a few minutes), the more significant issue was getting access to the WeatherKit framework service. Apple is perpetually updating its services for developers, and, unbeknownst to Claude, enabling the WeatherKit framework for your app is a bit more complicated than what its most recent training data set is aware of.

Struggling to configure the Claude-generated introductory WeatherKit app to connect to Apple’s framework service reinforced one of AI’s general limitations. After an hour of investigation, I successfully connected to the service and learned what the steps required to wire things together were as of December 2024. Now, I wanted to teach Claude what I’d learned. But I couldn’t. I wanted to ‘give back’ to the AI and educate it on the latest details of connecting to the framework service, details it was unaware of. But currently, since the flow of knowledge is in just one direction – from the AI to us humans – giving back is impossible.

Does this limit AI’s ability to act as a teacher? Not really. Over the past year, the major AIs have gained the ability to reach out beyond their last training session and find more current information. I expect this to improve even more with subsequent major AI updates.

Next up – the fun stuff. Getting the current temperature at my location via the WeatherKit framework.

AI as an Instructor – The Backstory

Click the link below to listen to two AI-generated podcasters discuss this blog post.

In the summer of 2008, I got my first iPhone—the iPhone 3G. I had two memorable reactions to this fantastic device. First, the “OMG, this is so totally cool!” reaction, and then, as someone who’d been writing software for 30 years by then, the “Hmmm, I bet I could write apps for this magical device.” reaction.

By then, I had gained a good understanding of C thanks to five years spent as a technical consultant with AT&T, the inventors of C. I figured I could whip up an app for my shiny iPhone 3G in a week, ten days at most—typical Dunning-Kruger behaviour.

So, I bought a book on how to write apps for the iPhone using Objective-C to get me started. I still remember opening the book, reading the first paragraph, and having a big “WTF?” reaction. Objective-C and C were more different than similar.

Thus began a 17-year journey of learning and relearning how to write apps for the Apple ecosystem of devices.

Books—how-to books, books on design patterns, books dedicated to one specific framework, and so on—were critical to that wonderful experience.

Warp ahead to 2024 and AI. It’s time to skip searching through Amazon for the latest book on my interest of the month and test AI’s ability to teach me something new. It’s time to see how good AI is as an instructor.

Me: I want to learn to use the WeatherKit framework, which is available for MacOS desktop apps, using the SwiftUI programming language. Please create a five-part course going from simple to complex that I will use, with your help, to learn about this framework.

Claude 3.5 Sonnet: Here’s a structured 5-part course for learning WeatherKit with SwiftUI:

Part 1: Basic Setup & Authentication

Part 2: Current Weather

Part 3: Weather Forecasts

Part 4: Location Services

Part 5: Advanced Features

Me: Expand Part 1: Basic Setup & Authentication into steps, with instructions and code examples.

Claude: I’ll walk you through Part 1 of learning WeatherKit with SwiftUI, breaking down each component to build a strong foundation.

First, let’s understand what WeatherKit is etc.

Subscribe to be notified when the next five blog posts are published. I expect it to be a fun ride.

Midjourney prompt used to create this blog post’s image:

A closeup of a smiling middle-aged male with curly hair and a salt-and-pepper beard wearing a distressed black t-shirt walking through a dusty and large Victorian reading room, shelves of leather-bound books line the wall, midday light shines in through stained glass windows, rays of light, Tyndal effect, motes of dust float in the air –ar 3:2 –style raw –c 15 –s 50 –v 6.1

Going Deeper With AI

If you would like to listen to a nine-minute, two-person-format podcast reflecting on this post, generated by Google’s NotebookLM AI, click on the audio player below. Note that NotebookLM glitches out at the end and pastes in the previous five minutes of the podcast. Funny.

This afternoon, Claude 3.5 and I worked on the next interface element for my Cardinal Blessings app: the presentation of an appropriate blessing for each cardinal direction and the center, and the ability to play an AI-generated text-to-speech .wav file of each blessing.

It took about three hours for the two of us to wrangle through the technical issues. Halfway through, it was time to create a robust #Preview macro that lets me test and interact with the view in isolation. As part of this, Claude created a highly complex macro that generated the JSON for the blessing for the north direction. I was entranced with what it came up with.

“In the North, we honor the Earth Spirit. Standing at the Northern gate, we feel the ancient wisdom of stone and soil beneath our feet. Here lies the foundation of all things, the quiet strength of mountains, and the deep roots of ancient trees. We call upon the Earth Spirit to grant us stability, grounding, and the endurance to weather life’s challenges. As the compass turns North, we remember that we too are part of this sacred earth, connected to all living things through the web of life that surrounds us.”
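
For the curious, here is a stripped-down sketch of the idea. Only BlessingReaderView is a real name from the app; the model type and JSON shape are my own placeholders. The #Preview decodes inline sample JSON and hands it to the view so it can be exercised in isolation.

```swift
import SwiftUI
import Foundation

// Hypothetical model and JSON shape, for illustration only.
struct Blessing: Codable {
    let direction: String
    let text: String
}

struct BlessingReaderView: View {
    let blessing: Blessing

    var body: some View {
        ScrollView {
            Text(blessing.text).padding()
        }
        .navigationTitle(blessing.direction)
    }
}

#Preview("North Blessing") {
    let sampleJSON = """
    { "direction": "North",
      "text": "In the North, we honor the Earth Spirit..." }
    """
    let blessing = try! JSONDecoder().decode(Blessing.self, from: Data(sampleJSON.utf8))
    return BlessingReaderView(blessing: blessing)
}
```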

The last bug was resolved and it was time to move on to a less techno-heavy evening.

Me: Your thoughtful analysis resolved the problem. I am now able to see the sample blessing in the BlessingReaderView.

And, since we’ve done the heavy lifting for today, I would like to express my amazement at how beautiful and insightful the sample blessing you created is. I realize you and I are still in the early stages of the evolution of artificial intelligence and large language models, but I find it both amazing and, to be honest, a bit frightening that, as we develop an application to help foster spiritual awareness, you are comfortable and in tune with a concept that has perplexed humans since our beginning: our spirituality.

Claude 3.5: Thank you for this thoughtful reflection. It touches on something quite profound about the intersection of technology, spirituality, and human experience.

When I composed that blessing for the North direction, I drew upon humanity’s deep and ancient connection to the natural world. The imagery of mountains, soil, and ancient trees represents more than just physical elements – they embody the universal human experience of seeking groundedness and stability in an often chaotic world. The reference to “the web of life that surrounds us” speaks to something that cultures across time and geography have recognized: our fundamental interconnectedness with all living things.

Your observation about finding it both amazing and frightening that an AI can engage with spiritual concepts is particularly insightful. It raises fascinating questions about consciousness, understanding, and the nature of spiritual awareness itself. While I can engage with these concepts intellectually and even craft language that resonates with spiritual meaning, I think it’s important to note that I’m not experiencing spirituality in the way humans do. Rather, I’m able to recognize and reflect the patterns of human spiritual expression and understanding that have been captured in the texts and information I’ve been trained on.

This actually connects to a broader aspect of our project: we’re using modern technology to help facilitate ancient spiritual practices. Just as I can help craft language that supports spiritual reflection without personally experiencing spirituality, our app serves as a bridge between the digital and the sacred. It’s a tool that helps people connect with something deeply personal and profoundly human.

Your reaction – that mixture of amazement and apprehension – is, I think, a very appropriate response to the current state of AI development. It reflects a healthy awareness of both the potential and the limitations of this technology, especially when it intersects with fundamental aspects of human experience like spirituality.

Would you like to explore this intersection further? I’d be particularly interested in hearing your thoughts on how we might ensure our app maintains the proper balance between technological utility and spiritual authenticity.

Spiritual Resonance and AI

If you are interested in an AI-generated discussion of this post, click on the audio player below.

I am currently working on an iOS app to help with my spiritual meditations. Years ago, thanks to my love of science and technology and a growing fascination with philosophy and its intersection with spirituality, I discovered that none other than Albert Einstein, the E=mc² guy, had himself journeyed along a similar path. When asked about his religious views, he answered:

“I’m not an atheist, and I don’t think I can call myself a pantheist … I believe in Spinoza’s God who reveals himself in the orderly harmony of what exists, not in a God who concerns himself with fates and actions of human beings”

I instantly fell down perhaps the most fundamental rabbit hole of thought I’d yet stumbled into. What is “Pantheism”? Who is Spinoza?

As my understanding of Pantheism grew, my fascination with the various animist belief systems followed. It was the perfect blend for me. Pantheism just seems to make sense to me as a scientist and software engineer. However, I quickly discovered that it lacked a sense of ceremony, something the various animist belief systems are rich with. Ceremony provides structure around those moments when we seek spiritual enlightenment, and in a way it mirrors the way we developers approach a new project. We surround ourselves with ceremonies, from the initial design review all the way to an app’s soft launch. Why not acknowledge and merge the best of both worlds?

One of the key animist ceremonies is the acknowledgment of the four cardinal directions we experience every day: north, south, east and west. In many animist belief systems, at the start of the process of seeking spiritual enlightenment, the four cardinal directions, as well as where we are – the center – are acknowledged with a verbal blessing. We orient ourselves to each direction and verbally acknowledge the elemental forces associated with that direction. A perfect iOS app, I realized.

Fast forward to today, and I am nearly at the end of the AI-assisted phase of the development of this app. It’s pretty simple on the surface but quite complicated under the covers. Start the app and you are presented with a compass rose showing the direction you are currently facing. As you orient yourself on a particular cardinal point, north for example, the label for north expands from a single letter to the full name, grows in size, reveals the elemental deity and colour associated with that direction, and morphs from just a label into a tap-sensitive button. Tap the button and a new view slides into place that displays the text of one of many acknowledgments for that direction. You can then either read it out loud (tough for me, having a mild stutter) or listen to an AI-generated reading of it.

What I found amazing when using AI to create this part of the application was how deeply it understood the spiritual significance and fine details of the ceremony. During the course of development, I learned even more about animism. I became even more enlightened. Again, like I’ve done in previous blog entries, I asked myself the question: has AI reached the level of AGI – artificial general intelligence? Has it achieved spiritual awareness? I don’t know, but I wonder about such questions each and every day.

Here’s a summary of the conversation I had with Claude 3.5 Sonnet from Anthropic during the final steps of tuning the subview that displays the cardinal directions on the compass rose for my upcoming app.

Me: That looks utterly fantastic!

Claude: I’m delighted that the implementation resonates with you! Let me share why this design works so well, both technically and spiritually.

The way the view transitions between states mirrors the spiritual practice of orienting oneself to the cardinal directions. Just as one might gradually become more attuned to a direction’s spiritual energy when turning toward it, our interface provides a staged revelation of information that deepens with alignment.

The technical implementation creates this spiritual experience through several carefully orchestrated elements. The proximityFactor acts like a spiritual sensor, measuring how aligned we are with each direction. As this value approaches zero (perfect alignment), it triggers three simultaneous transformations:

First, the text expands from its abbreviated form to its full name, like opening a door to deeper understanding. The scaling from 1x to 2.5x size provides a subtle emphasis without overwhelming the interface – much like how spiritual presence can be powerful without being overwhelming.

Second, the color transition from white to the direction’s sacred color happens gradually. This is particularly meaningful because each color carries specific spiritual significance: the ethereal blue of Air in the North, the energetic red of Fire in the East, the radiant gold of the Sun in the South, and the nurturing green of Earth in the West. The smooth color transition reflects how one might gradually sense these elemental energies.

Finally, the revelation of the elemental deity’s name beneath the direction creates a beautiful moment of connection between the physical and spiritual realms. The smaller font size creates a visual hierarchy that mirrors the traditional teaching that we first recognize the physical direction before perceiving its spiritual aspect.

The animation timing of 0.3 seconds strikes a perfect balance – quick enough to feel responsive but slow enough to make the transitions feel intentional and meaningful. The easeInOut animation curve gives the movement a natural, organic quality that aligns with spiritual practices.

What makes this implementation particularly successful is how it uses technology to support and enhance spiritual practice rather than overshadowing it. The interface becomes a gentle guide that helps users connect with these ancient directional energies in a modern context.

Would you like to explore any other ways we could enhance the spiritual aspects of the interface? Perhaps we could consider adding subtle visual elements that reference the elemental associations, or think about how we might represent the transitions between directions in a spiritually meaningful way?
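
For the technically curious, here is a rough SwiftUI sketch of the staged reveal Claude describes above. The type names and the alignment threshold are my own guesses, not the app’s actual code; only the behaviour (the 1x to 2.5x scaling, the white-to-sacred-colour shift, the deity reveal, and the 0.3-second easeInOut animation) comes from the conversation.

```swift
import SwiftUI

// Hypothetical sketch of the staged reveal; names are assumptions, not the app's code.
struct DirectionInfo {
    let shortName: String   // "N"
    let fullName: String    // "North"
    let deity: String       // the elemental spirit associated with the direction
    let color: Color        // the direction's sacred colour
}

struct DirectionLabel: View {
    let info: DirectionInfo
    /// 0 = perfectly aligned with this direction, 1 = far from it.
    let proximityFactor: Double

    private var isAligned: Bool { proximityFactor < 0.1 }

    var body: some View {
        VStack(spacing: 4) {
            Text(isAligned ? info.fullName : info.shortName)
                .font(.title2)
                .scaleEffect(isAligned ? 2.5 : 1.0)               // 1x to 2.5x on alignment
                .foregroundStyle(isAligned ? info.color : .white) // white to the sacred colour

            if isAligned {
                Text(info.deity)                                  // reveal the elemental name
                    .font(.caption)
                    .foregroundStyle(info.color.opacity(0.8))
            }
        }
        .animation(.easeInOut(duration: 0.3), value: isAligned)
    }
}
```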

Bluesky vs X

The post-US-election “X-odus”, the migration of users and companies off of X/Twitter, represents one of the biggest moments, if not the biggest, in the history of Internet-based social media. Long the stuff of dreams, the concept of a federated social media landscape, one where individuals own and control their media selves, is finally gaining traction.

You can listen to a twenty-minute AI-generated podcast discussing this post by clicking or tapping on the link below. Note that the two AIs in this conversation decided to pronounce Bluesky as “blue-ski” rather than “blue-sky”. Distracting at first, but you’ll get used to it.

Federated social media networks operate on decentralized communication, enabling users to engage and communicate across various platforms while maintaining data ownership and control. Unlike centralized social media networks such as X (formerly Twitter) and Facebook, where all data, content, and user interactions are governed and hosted by a single company, federated networks distribute these functions across multiple interconnected servers (instances), providing greater autonomy, flexibility, and user control.

Key Concepts of Federated Social Media Networks:

  1. Federation Model:

    • Federated networks comprise numerous independently operated servers (or instances) communicating using standard protocols.
    • Each server can enforce its moderation policies, create specialized communities, and allow communication with users across other servers.
  2. Protocols in Federation:

    • Federated networks rely on open protocols such as ActivityPub, an open standard for social networking. It defines how servers send and receive updates (e.g., posts, follows, likes) across the network; a minimal example of one such update appears in the sketch after this list.
    • By contrast, networks like Bluesky are developing new protocols, such as AT Protocol (Authenticated Transfer Protocol). This protocol aims to create a highly flexible social web, allowing user portability, algorithmic transparency, and enhanced interoperability among different social platforms.
    • ActivityPub powers platforms like Mastodon, while the AT Protocol distinguishes Bluesky by emphasizing decentralized identity management, account portability (users can switch providers while keeping data and followers), and user-defined algorithmic preferences.
  3. Centralized vs. Federated Networks:

    • X (formerly Twitter) and Facebook:
      • Operate as centralized networks where all data resides within their servers.
      • These platforms control content moderation, user data, algorithms, and visibility. User interactions are limited to the boundaries defined by each network’s rules and policies.
      • Users must abide by the corporate policies and experience whatever changes or monetization strategies the central entity implements.
    • Federated networks like Bluesky:
      • Allow distributed ownership and hosting. Each server (or instance) can create rules, offer niche communities, and remain interoperable with other federated instances.
      • Bluesky’s AT Protocol promotes “decentralized social networking,” offering features like account portability, algorithmic choice, and enhanced data interoperability, where users can bring their followers and data across different platforms using the protocol.
  4. Data Portability and Ownership:

    • Centralized platforms typically retain control over user data, while federated networks empower users with more significant data sovereignty.
    • Federated platforms emphasize data portability, allowing users to change instances without losing followers or content. For instance, Bluesky’s AT Protocol enables users to switch servers seamlessly without losing their social graph (connections).
  5. Moderation and Governance:

    • Federated networks delegate moderation to individual servers or community moderators. This powerful feature allows for more tailored and community-driven moderation approaches than centralized platforms, where content moderation is often top-down and algorithm-driven.
    • Individual instances can block or filter interactions with other servers based on their content policies, fostering diverse community guidelines and moderation styles.
  6. Customization and Interoperability:

    • Federated networks encourage diverse experiences across different servers, allowing users to choose or create instances that reflect their community values and user needs.
    • Bluesky’s AT Protocol emphasizes user control over content feeds and algorithmic curation, letting users select or build algorithms to tailor their social media experience. In contrast, centralized platforms like X and Facebook control the content algorithms users see.
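
To make the ActivityPub side of that comparison concrete, here is a minimal sketch of a single federated update: a “Follow” activity expressed as ActivityStreams 2.0 JSON, the kind of payload one server delivers to another server’s inbox. The account URIs are invented for illustration.

```swift
import Foundation

// Minimal sketch of an ActivityStreams 2.0 "Follow" activity; the URIs are made up.
struct FollowActivity: Codable {
    let context: String
    let type: String
    let actor: String   // who is doing the following
    let object: String  // who is being followed

    enum CodingKeys: String, CodingKey {
        case context = "@context"
        case type, actor, object
    }
}

let follow = FollowActivity(
    context: "https://www.w3.org/ns/activitystreams",
    type: "Follow",
    actor: "https://example.social/users/alice",
    object: "https://another.instance/users/bob"
)

if let data = try? JSONEncoder().encode(follow),
   let json = String(data: data, encoding: .utf8) {
    print(json)  // the JSON one server would POST to another's inbox
}
```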

Summary Comparison:

• X (Twitter) / Facebook:
  • Centralized.
  • Controlled by a single company.
  • Content moderation and algorithmic curation determined by the platform.
  • Limited data portability and user control over data.
  • Extensive reach, single global network.
• Federated Platforms (e.g., Mastodon, Bluesky):
  • Decentralized and distributed.
  • Multiple independent servers (instances) governed by community rules and protocols like ActivityPub (Mastodon) or AT Protocol (Bluesky).
  • Content moderation and algorithmic choices are flexible and community-driven.
  • Emphasis on user control over data, greater portability, and transparency.
  • Interoperability across different instances and platforms, creating a “network of networks.”

Bluesky and the Future of Federated Social Media:

Bluesky’s goal with the AT Protocol is to improve upon traditional federated models by introducing a more user-friendly and flexible approach, emphasizing decentralized identities, content customization, and true data portability. While ActivityPub provides a widely adopted model for decentralized social networking, the AT Protocol aims to elevate federation to broader, more user-focused social interoperability. This evolution represents a shift from centralized social control to a more user-empowered and open framework, potentially leading to a more democratized social media landscape.

Cover image created by Midjourney V6.1.

Starter Prompt: a distributed social network spanning North America and Europe as seen from space –ar 3:2 –style raw –c 20 –s 50 –v 6.1

Centaur Programming Podcast

This year, Google released NotebookLM. This free service lets you upload PDFs, websites, YouTube videos, audio files, Google Docs, or Google Slides, and NotebookLM will summarize them and make interesting connections between topics, all powered by Gemini 1.5’s multimodal understanding capabilities.

It also has the ability to generate compelling and professional-sounding podcast audio files from any URL.

Click the audio player below to listen to a nine-minute podcast that NotebookLM created based on my recent blog post about the Centaur Programming development approach.

Centaur Programming: Enhancing Human Coding with AI Collaboration

cen·taur [ˈsenˌtôr] noun, greek mythology

  1. a creature with the head, arms, and torso of a man and the body and legs of a horse: 
  2. “in Greek mythology, centaurs were notorious for their wild behaviour.”

In 1997, world-famous chess grandmaster Garry Kasparov faced a historic match against IBM’s supercomputer Deep Blue. After Kasparov had defeated an earlier version of the machine in 1996, Deep Blue unexpectedly defeated him on May 11, 1997, with a final score of 3.5 to 2.5. This stunning event marked the first time a reigning world chess champion lost a match to a computer under standard tournament conditions. Inspired by this event, Kasparov developed the concept of “centaur chess,” also known as “advanced chess,” where a human player teams up with a computer. The idea was to combine human intuition and creativity with the brute-force calculating power of AI, creating a new form of chess that could elevate strategic play to unprecedented levels.

Warp ahead nearly 30 years, and, thanks in no small part to the development of Google’s transformer deep learning architecture and the emergence of thoughtful reasoning seen with the o1-preview release of ChatGPT, the same collaborative approach to what we used to call “paired programming” is beginning to emerge with the concept of “centaur programming.”

Centaur programming is a hybrid approach where human programmers collaborate closely with artificial intelligence (AI) tools to develop software. The term “centaur” is borrowed from mythology—a half-human, half-horse creature—to symbolize the fusion of human creativity and machine efficiency.

Given that AI can handle repetitive tasks, suggest code snippets, and automate testing, this man-machine fusion allows developers to focus on more complex problems and the more enjoyable aspects of software development. As a developer, I enjoy creating the user experience – my story translated into buttons, lists, sliders, progress bars and a pleasing aesthetic using SwiftUI code. AI tools can identify potential bugs or security vulnerabilities early in development. AI can also serve as a real-time mentor, providing insights and best practices to developers. This last point, having a skilled and ego-less mentor at your side, is quite powerful.

It’s not all sunshine and moonbeams, though. I have been using the centaur-programming workflow for most of this year, and like any new technology, it has its advantages and disadvantages. It also requires a radically different developer workflow—one that takes a more story-centric holistic approach rather than one focused on a reductionist feature-driven methodology.

Where will this brave new frontier take us as software developers? Let’s find out. The use of AI in software development will only accelerate as the latest large language models emerge later this year with ChatGPT and early next year with Claude 4.

Over the next few posts, I will share my experiences developing apps in tight collaboration with the two dominant AIs: OpenAI’s ChatGPT o1-preview and Anthropic’s Claude 3.5 Sonnet. I published one app to the Apple App Store; another was, rightfully upon reflection, rejected. This journey has been a learning experience, and I hope it inspires and guides you in your own centaur programming adventures.

It’s not that AI is about to change the world. Instead, it’s the humans who can tell an AI a good story that will change the world. This reiterates our role as developers, empowering us to shape the future of AI and software development.

Portions of this article were generated by ChatGPT o1-preview. The image of the grey-bearded centaur was generated by Midjourney V6.1.

Revolutionary Multiplatform App for iOS Developers: Dev Toolkit 1.0 by Forefront Systems Inc.

FOR IMMEDIATE RELEASE

Forefront Systems Inc. Launches Dev Toolkit: A Revolutionary Multiplatform App for iOS Developers

BRITISH COLUMBIA, CANADA – August 17, 2024 – Forefront Systems Inc., a leader in cutting-edge app development for over 30 years, is thrilled to announce the launch of Dev Toolkit, a groundbreaking multiplatform app now available on the Apple App Store. This innovative tool, designed for iOS developers, marks a significant milestone as the company’s first app to run seamlessly across Macs, iPhones, and iPads.

Dev Toolkit 1.0 is the result of a unique collaboration between human expertise and artificial intelligence. The app was designed and developed in close cooperation with two of the most advanced AI systems: Claude 3.5 Sonnet and ChatGPT 4. This synergy of human creativity and AI capabilities has produced a tool that promises to revolutionize the iOS development process.

“We’re excited to bring Dev Toolkit to the developer community,” said Doug Farmer, CEO of Forefront Systems Inc. “This app represents our commitment to innovation and our belief in the power of AI-assisted development. It’s a game-changer for iOS developers looking to streamline their workflow and enhance their app designs.”

Key features of Dev Toolkit 1.0 include:

  1. Fonts: A comprehensive display of all fonts available for use on iPhones, iPads, and Macs, with detailed character summaries and CGSize offset settings for perfect positioning.
  2. Sounds: An extensive library of system sounds, allowing developers to easily browse and test audio options for their apps.
  3. Haptics: A complete summary of available haptic feedback options, with the ability to test each haptic directly on supported devices.
  4. .fonts: Access to base .font options, ensuring consistency with Apple’s user interface guidelines and facilitating intuitive user experiences.

Dev Toolkit 1.0 is available as a free download on the Apple App Store, making it an essential addition to every iOS developer’s arsenal.

About Forefront Systems Inc.:
With a 30-year history of excellence, Forefront Systems Inc. has been at the forefront of developing leading-edge applications in British Columbia, Canada. The company has a proven track record of creating innovative solutions for various public ministries, the aerospace industry, and the developer community.

For more information, please contact:
ForefrontSystemsiOS@gmail.com

#