The simplest things are often the hardest.

So I had a few more successes last week.

I got CoreAudio/AudioUnit Swift code running in an Xcode Playground. This was a big help in understanding how AudioUnits work and how to work with C pointers in Swift.
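For reference, the core of what ended up working in the playground looked roughly like the sketch below. It assumes a macOS playground and the default output unit (on iOS the subtype would be kAudioUnitSubType_RemoteIO), and it trims error handling down to printing the status. Passing &description and &unit is exactly the kind of C-pointer bridging I mean.

```swift
import AudioToolbox

// Minimal sketch: locate and start the default output AudioUnit.
var description = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_DefaultOutput,   // RemoteIO on iOS
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0
)

var unit: AudioUnit?
if let component = AudioComponentFindNext(nil, &description) {
    // These calls take C pointers, hence the & bridging from Swift vars.
    var status = AudioComponentInstanceNew(component, &unit)
    if status == noErr, let unit = unit {
        status = AudioUnitInitialize(unit)
        status = AudioOutputUnitStart(unit)
    }
    print("AudioUnit status: \(status)")
}
```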

I also got a very simple CoreLocation application running, which retrieved the compass direction. [That is probably the easiest thing I have done so far.]
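The heading part really is only a few lines. Here is a minimal sketch; the CompassReader class name is my own, everything else is straight CoreLocation:

```swift
import CoreLocation

// Minimal sketch: read the compass heading via CLLocationManager.
final class CompassReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        if CLLocationManager.headingAvailable() {
            manager.startUpdatingHeading()
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // magneticHeading is in degrees from magnetic north (0–359.9).
        print("Compass direction: \(newHeading.magneticHeading)°")
    }
}
```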

Now I have to vent just a little. In developing this app for iOS, there is one thing about building for an Apple platform that completely baffles me.

I really wish Apple’s developer documentation was a lot better. I cannot think of a single thing I have tried to do on iOS or macOS so far for which I did not have to look up a tutorial or an example from a third-party source.

Compared to Microsoft’s MSDN, Apple’s developer documentation just seems pretty thin.

Sure, Apple supplies some guides and sample code. However, I tend to find Apple’s choice of sample code examples a little esoteric, especially when I need a really brain-dead example to help me grasp how I should use a framework.

I don’t know, I just find it funny.

10,000 more baby steps to go.

Ok, the 4th of July threw me off, but after two weeks there is a lot to update on.

First, I finally got a basic Structure app working on my iPhone. Photos below.

Camera image of sofa on left, Structure Sensor depth image on right.

Camera image of desk and computer on left, Structure Sensor depth image on right.

Camera image of backpack on chair on left, Structure Sensor depth image on right.

The biggest challenge was getting used to the YpCbCr color space. Also, there was a small bug in the Structure SDK that caused the synchronized-frame calls not to work. However, considering my end goals, synchronized frames are not critical yet.
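For anyone curious, the color math itself is just the textbook full-range BT.601 conversion. A minimal per-pixel sketch (in practice you would do this on the GPU or with Accelerate rather than pixel by pixel) looks like this:

```swift
// Full-range BT.601 YpCbCr -> RGB conversion for a single pixel.
func rgb(fromY y: UInt8, cb: UInt8, cr: UInt8) -> (r: UInt8, g: UInt8, b: UInt8) {
    let yf = Double(y)
    let cbf = Double(cb) - 128.0
    let crf = Double(cr) - 128.0

    let r = yf + 1.402 * crf
    let g = yf - 0.344136 * cbf - 0.714136 * crf
    let b = yf + 1.772 * cbf

    // Clamp back into 0...255 before narrowing to UInt8.
    let clamp = { (v: Double) -> UInt8 in UInt8(max(0, min(255, v.rounded()))) }
    return (clamp(r), clamp(g), clamp(b))
}
```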

I was also able to get a very basic audio and MIDI application running on the iPhone. This was crazy because I had to pull information from multiple sources to figure out how to do it. The current Apple examples push AVFoundation, but I needed CoreAudio.
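On the MIDI side, the CoreMIDI setup ended up looking roughly like the sketch below; the client and port names are mine, and error checking is omitted for brevity. The audio side was the same AudioUnit setup as in the playground, just with the RemoteIO subtype.

```swift
import CoreMIDI

// Minimal sketch: create a MIDI client and input port, then connect
// every available source to the port.
var client = MIDIClientRef()
MIDIClientCreate("MyMIDIClient" as CFString, nil, nil, &client)

var inputPort = MIDIPortRef()
MIDIInputPortCreate(client, "Input" as CFString, { packetList, _, _ in
    // Each MIDIPacket carries raw MIDI bytes (status + data).
    print("Received \(packetList.pointee.numPackets) MIDI packet(s)")
}, nil, &inputPort)

// Attach the port to every MIDI source the system knows about.
for i in 0..<MIDIGetNumberOfSources() {
    MIDIPortConnectSource(inputPort, MIDIGetSource(i), nil)
}
```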

The point is, I am just glad I have been able to accomplish baby steps in both areas: using the sensor and generating sound. Now there are only about 10,000 more baby steps to go.