
iOS / ARKit and SceneKit Introduction

I am thinking of writing a series of blog articles on developing iOS apps, at an experimental and educational level. There is no particular aim, but I think documenting my baby steps with iOS app development might actually help others.

MORE FUN THAN I THOUGHT

Being exposed to Apple's Xcode for the first time (I am using the version 9 beta) and Swift 4, along with the iPhone's capability to do Augmented Reality, really opens up a whole lot of fun areas I can explore.

Even as a 3D artist with little programming knowledge like me, I can say that it just takes a bit of effort and motivation to brave yourself and try app creation. Even if you are not going to make a paid app, the act of actually making your iOS device work for you using your own custom app is amazing.

Making programs for desktop computers and making apps for mobile devices are totally different sets of skills, in the sense that a mobile device (which comes with a powerful camera, WiFi, Bluetooth, AR capability, GPS and a lot more) gives you one tool that does many things, and you are mobile.

Prerequisites:

- You need at least an iPhone or iPad with AR capability (an A9 chip or later); the iPhone 7 with its dual camera is a great AR starter.
- You also need a macOS computer for Apple iOS app development. I recommend a MacBook Pro.
- Some kind of basic programming background. Python is good, and it is easy to switch from Python to Swift. But you can also start with Swift Playgrounds to understand all the concepts.
- Do actually learn Blender 3D as well. It's a challenge, but worth it.

XCODE

The first step is to sign up as a free Apple developer and download the latest Xcode. Xcode is a powerful app authoring tool for designing and building the apps you can imagine.

You can actually create your own apps and install them on your device before even thinking about signing up for the paid Apple Developer Program to sell your apps on the App Store.

Even a free Xcode account can take you quite far in making your iPhone and iPad much more valuable and powerful devices!

SWIFT LANGUAGE

Swift is a fun, interactive programming language! Before Swift, making apps was actually a lot more complicated. At least, that is how I found it.

Swift has the simplicity of Python and also gives quick feedback, a bit like JavaScript. Even as someone self-taught in programming, I found Swift really easy to get into.

If you are just starting out, do check out some of Apple's free educational tutorials on iBooks for Swift Playgrounds (the iPad app), or for Xcode playgrounds if you don't have an iPad.

SCENEKIT FOR 3D ARKIT

My biggest interest and focus with iOS app development is actually to work and play with 3D content and present it as an Augmented Reality (AR) app.

This will become easier over time, especially with services like Sketchfab, which will soon release their 3D AR app for iOS. This is actually a huge thing. Sketchfab is growing, and it supports Blender (BLEND) 3D creations. So for general users, seeing 3D will become common.

For us individuals, designers, independent artists and passionate creators, we can make our own custom apps to assist our own work.

When you jump into AR after watching the WWDC 2017 presentation on ARKit, soon enough you will want to try their example Xcode project, but I think it is also a good idea to start with the templates Xcode provides. With Xcode 9, you have a few templates:
  • AR + SceneKit
  • AR + SpriteKit
  • AR + Metal
These templates give you a nice starting point for making your AR app. Even though SpriteKit is primarily for working in 2D, SceneKit for 3D, and Metal for building your own rendering from scratch, they can all work together under the hood. They are not separate subjects, and with ARKit we are actually using a Metal-based render engine of some sort, capable of merging and meshing the real world with our own virtual 3D creations.
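The heart of the AR + SceneKit template is only a few lines of Swift. The sketch below shows roughly what the generated view controller does; the class name, the `sceneView` outlet and the `art.scnassets/ship.scn` sample scene follow the standard template, but treat the details as illustrative rather than exact:

```swift
import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // wired up in the template's storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        // Load the sample 3D scene that ships with the template
        sceneView.scene = SCNScene(named: "art.scnassets/ship.scn")!
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start world tracking: this is what merges the camera feed
        // with our virtual 3D content
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

That is really all it takes: a view, a scene, and a running world-tracking session.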

You can use a SpriteKit 2D scene as a material for your 3D SceneKit objects, and Metal as a shader, for example. I am still discovering a lot of the potential. Each area needs its own time to learn and explore, but eventually we can put everything together, I hope.
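As a small illustration of the SpriteKit-as-material idea, here is a sketch (names and sizes are my own, made up for this example) that renders a live SpriteKit scene onto the face of a SceneKit box:

```swift
import SceneKit
import SpriteKit

// A tiny 2D SpriteKit scene: orange background with a text label
let skScene = SKScene(size: CGSize(width: 256, height: 256))
skScene.backgroundColor = .orange
let label = SKLabelNode(text: "Hello AR")
label.position = CGPoint(x: 128, y: 128)
skScene.addChild(label)

// A 3D box whose surface texture is the SpriteKit scene above.
// SceneKit re-renders the material whenever the 2D scene animates.
let box = SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0.05)
box.firstMaterial?.diffuse.contents = skScene
let boxNode = SCNNode(geometry: box)
```

Because the SpriteKit scene keeps running, anything animated in 2D shows up animated on the 3D surface.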

SCENEKIT WITH SWIFT vs WITH BLENDER

You can approach 3D content creation in a few different ways:
1. Via Swift Coding
2. Via XCode SceneKit Editor
3. Via Blender

The "We Heart Swift" site has some really cool examples on generating SceneKit (SCN) geometry using Swift code:
https://www.weheartswift.com/introduction-scenekit-part-1/

I am actually very eager to do the "We Heart Swift" SceneKit tutorial and apply it to AR.

Generating 3D objects via code is the most "god-like" ability. If you can generate objects using code, you can perhaps also easily give users a custom interface to generate their own objects on the fly. This is the very basis of app creation.
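To give a feel for this, here is a minimal sketch of a scene built entirely in code, no .scn file involved (the sizes, colors and positions here are arbitrary choices for illustration):

```swift
import UIKit
import SceneKit

// Build a scene from scratch: a red sphere floating in space
let scene = SCNScene()

let sphere = SCNSphere(radius: 0.1)                 // 10 cm radius
sphere.firstMaterial?.diffuse.contents = UIColor.red
let sphereNode = SCNNode(geometry: sphere)
sphereNode.position = SCNVector3(0, 0.3, 0)         // 30 cm above the origin
scene.rootNode.addChildNode(sphereNode)

// In an AR template, this would be assigned to sceneView.scene
```

Swap `SCNSphere` for `SCNBox`, `SCNTorus`, `SCNText` and so on, and you can start generating whole worlds from a few lines of code.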

With Xcode, you can actually build a simple 3D scene in the SCN format entirely from within the app! The interface is very similar to the Unity 3D editor.



Using the Xcode scene (SCN) editor is a great way to start, and remember that you can always use Swift at a later stage to manipulate, transform and animate every parameter and attribute of these 3D objects.
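For example, the sketch below grabs a node that was authored in the SCN editor and spins it from code. The file name "art.scnassets/main.scn" and the node name "cube" are assumptions for this example; you would use whatever names you gave things in the editor's inspector:

```swift
import SceneKit

// Load a scene authored in the Xcode SCN editor and find a node by
// the name it was given in the editor's node inspector
if let scene = SCNScene(named: "art.scnassets/main.scn"),
   let cube = scene.rootNode.childNode(withName: "cube", recursively: true) {
    // Spin forever: one full turn around the Y axis every 4 seconds
    let spin = SCNAction.rotateBy(x: 0, y: CGFloat.pi * 2, z: 0, duration: 4)
    cube.runAction(SCNAction.repeatForever(spin))
}
```

This is the nice division of labor: lay things out visually, then let Swift drive the behavior.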

If you happen to already know a 3D tool like Blender (open source) or another package like Maya, Houdini, 3ds Max, Cinema 4D, Modo or LightWave, you can easily bring your 3D content into Xcode as an asset in DAE format. The DAE file format works great for app development.

The only big gotcha is Blender's Z-up versus SceneKit's Y-up axis convention, which you need to be careful with.

NOTE: A Z-up DAE from Blender will actually work fine as it is, but when you convert the DAE into SCN, sometimes you get an orientation issue. Supposedly there is a fix for this, whether via code or a toggle.

Perhaps there are a few things that need to be learned from the documentation to know the limitations, etc. DAE can easily be converted to the SCN format, and vice versa, using Xcode.
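One fix via code is simply to counter-rotate the imported node by 90 degrees around X. A sketch, assuming a hypothetical "art.scnassets/monkey.dae" asset containing a node named "Suzanne" (Blender's mascot; both names are made up for illustration):

```swift
import SceneKit

// Load a Blender-exported DAE and compensate for the Z-up vs Y-up
// convention by rotating the node -90 degrees about the X axis
if let scene = SCNScene(named: "art.scnassets/monkey.dae"),
   let monkey = scene.rootNode.childNode(withName: "Suzanne", recursively: true) {
    monkey.eulerAngles.x = -Float.pi / 2   // Z-up (Blender) -> Y-up (SceneKit)
}
```

Alternatively, you can apply the rotation (or the exporter's axis options) on the Blender side before exporting, so the asset arrives already Y-up.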

The "Salt Pig Media" blog is great for getting started with SceneKit for game creation:
http://saltpigmedia.com/blog/First-steps-in-SceneKit#comment-3369171684

So, whether you want to code your 3D scene 100%, use a GUI like the one inside Xcode, or use Blender, it will all work together eventually.

RUNNING YOUR FIRST AR APP FROM A TEMPLATE IS VERY EASY

The Build and Run button in Xcode is actually the one button you need to press to make it work. For AR, you need an actual device to see it happening:





Soon enough you will want to:
- use your own 3D assets and characters
- control your 3D AR using custom buttons and taps
- change the size and animate the objects, maybe using Swift code
- a lot more...
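As a taste of that second point, here is a sketch of a tap handler for the template's view controller that hit-tests the camera feed and drops a small box on any detected real-world plane. The `ViewController` and `sceneView` names follow the standard template; the sizes are arbitrary:

```swift
import UIKit
import ARKit
import SceneKit

extension ViewController {
    // Register in viewDidLoad with:
    // sceneView.addGestureRecognizer(
    //     UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Ask ARKit what real-world surface sits under the tapped pixel
        guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first
        else { return }

        // Place a 5 cm virtual box exactly where the tap hit the plane
        let box = SCNBox(width: 0.05, height: 0.05, length: 0.05, chamferRadius: 0.005)
        let node = SCNNode(geometry: box)
        node.simdTransform = result.worldTransform
        sceneView.scene.rootNode.addChildNode(node)
    }
}
```

A few lines like these are the difference between watching the template's demo and actually interacting with your own AR world.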

So there you go. Making an app is a long journey, but the first step is the most important. Start with the templates from Xcode, or download projects by others on GitHub, and you are good to go to get a taste of actually making an app.




