Lightwell 2.0

We’re very excited to announce the release of Lightwell 2.0. 👾



It includes new features like automatic asset scaling for retina devices and image gamma correction. There are also improvements and fixes to existing features such as additive alpha animations and macOS 10.13 path drawing.

However, what we’re especially excited to share is something much subtler to notice. It’s the reason this is our biggest release since launch.

To understand this change, I want to share a bit of how Lightwell works.

As a creator in Lightwell, you will likely position and customize layers, choreograph animations, and add interactions. While you do this, Lightwell is generating a series of configuration files for your scenes. These configuration files are then read and translated into dynamic content when you run the project in the Lightwell Previewer or from Xcode.
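To make this concrete, here is a purely illustrative sketch of what one of those scene configuration files might contain. Lightwell’s actual file format is internal and undocumented, so every field name below is an assumption, not the real schema:

```json
{
  "scene": "forest",
  "layers": [
    {
      "name": "pan",
      "image": "pan.png",
      "position": { "x": 120, "y": 340 },
      "animations": [
        { "type": "bounce", "duration": 0.8, "repeat": true }
      ],
      "interactions": [
        { "trigger": "tap", "action": "playDialogue" }
      ]
    }
  ]
}
```

The editor writes files like this as you work, and the runtime (the Lightwell Previewer, or your project running from Xcode) reads them back to reconstruct the same scene.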

All of this works through Hullabalu StoryKit (“HSK”), a framework that we built for our Adventures of Pan app series. HSK manages everything from reading the configuration files and handling user and device input, to progressing dialogue within a scene and navigating an entire app project.

Over the last few months since Lightwell’s release, we have been so inspired by the community of creators and the projects being created. The more we’ve learned about what you wanted out of Lightwell, the more we came to realize that the old version of StoryKit needed some significant updates.

One of the questions we get asked the most is: “Can I publish to Android?” The main reason we have historically said “no” was due to our rendering engine.

Rendering whatsit?

A rendering engine is the code that does the heavy lifting of drawing image assets to the screen. This includes everything from reading in pixel-by-pixel image data, to positioning, transforming, and animating each image, to finally drawing every pixel on the screen.
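As a sketch of those stages, the toy Swift code below models decoded image data, a per-layer transform, and a single frame of an animate-then-draw loop. All of the type and function names here are invented for illustration; a real engine like HSK’s would submit the transformed geometry to the GPU rather than just counting draw calls:

```swift
// Toy sketch of a rendering pipeline's data — not HSK's real types.

struct Texture {
    let width: Int
    let height: Int
    let pixels: [UInt8]   // decoded RGBA image data, 4 bytes per pixel
}

struct Transform {
    var x: Double = 0
    var y: Double = 0
    var scale: Double = 1
    var rotation: Double = 0
}

struct RenderNode {
    var texture: Texture
    var transform: Transform
    var alpha: Double = 1
}

// One frame: advance animations, then "draw" each visible layer.
func renderFrame(nodes: inout [RenderNode], deltaTime: Double) -> Int {
    var drawCalls = 0
    for i in nodes.indices {
        // 1. Animate: e.g. drift each layer to the right over time.
        nodes[i].transform.x += 10 * deltaTime
        // 2. Draw: a real engine would submit the transformed quad
        //    to the GPU here; we only count what would be drawn.
        if nodes[i].alpha > 0 { drawCalls += 1 }
    }
    return drawCalls
}
```

Running `renderFrame` once per display refresh, with `deltaTime` as the seconds since the previous frame, is what keeps animation speed consistent across devices.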


Rendering engines have a lot of components that can be designed to make things easy to develop, work for a wide range of applications, run fast, or perform expensive image effects. However, no single rendering engine delivers all of these at once; every design trades some of them off against the others.

Hullabalu StoryKit was initially built with a rendering engine that was quick to set up, gave us access to the effects we needed at the time, and handled a reasonable workload: many full screens’ worth of animating images. The drawback: it was iOS- and macOS-only. So, we looked around for alternatives.

We had a very specific set of requirements for our rendering engine.

  • Cross-platform codebase. Can run the same code in the editor software and in your final app. This is key to the what-you-see-is-what-you-get (WYSIWYG) nature of Lightwell.
  • Free to distribute. We believe that since you’re the ones building the apps, you should have full ownership of them. That’s why we don’t ask for any sort of revenue share, and our rendering engine shouldn’t either.
  • Supports all our current features. In particular, grouping layers, timing paced animations, alpha hit detection, sprite texture animations, parallax and swing rotation.
  • API access. We want to give more power to creators, while keeping the tool intuitive to use. This means access via an easy-to-learn API so that scenes can be extended with code.
  • Support for future features. See below for a sneak peek of bigger items on our roadmap.

And, of course, the rendering engine has to be capable of doing all of this in real-time. Since mobile apps are inherently interactive, content needs to be dynamic and responsive.

After searching, we couldn’t find any preexisting solution that had everything we needed. So, we built our own. 🤓

Introducing HSK 2.0:

Harnessing the power of OpenGL, Hullabalu StoryKit now includes a rendering engine that is built explicitly for your Lightwell projects. That means it is optimized to animate your layers in real-time while fully responding to device inputs like touch and tilt. 👆

We can now deform layers for parallax without sacrificing performance, and add new functionality with first-class support. It’s all built with Swift, so the code-side interface is simple and easy to learn (we’ll be sharing more about the API soon).
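Since the API hasn’t been published yet, the snippet below is a purely hypothetical sketch of what code-level scene access could feel like. Every type and method name (`Scene`, `Layer`, `onTap`, and so on) is an invented stand-in, with minimal stubs included so the sketch is self-contained:

```swift
// Hypothetical sketch only: HSK's real API has not been published,
// so Scene, Layer, and every method here are invented stand-ins.
final class Layer {
    let name: String
    var x: Double = 0
    var alpha: Double = 1
    init(name: String) { self.name = name }

    // A real API would animate this over a duration; here it is immediate.
    func move(toX newX: Double) { x = newX }
}

final class Scene {
    private var layersByName: [String: Layer] = [:]

    func add(_ layer: Layer) { layersByName[layer.name] = layer }
    func layer(named name: String) -> Layer? { layersByName[name] }

    // A real engine would register the handler with its input system;
    // this stub fires it immediately just to show the call shape.
    func onTap(of layer: Layer, handler: (Layer) -> Void) {
        handler(layer)
    }
}

let scene = Scene()
scene.add(Layer(name: "cloud"))
if let cloud = scene.layer(named: "cloud") {
    cloud.move(toX: 300)                      // reposition a layer from code
    scene.onTap(of: cloud) { $0.alpha = 0 }   // hide it when tapped
}
```

The point is the shape, not the names: looking up layers by name and attaching behavior in a few lines, on top of the scenes you built visually.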

This also means that we can start to take your apps cross-platform.

(👋 Android!)

As we build out cross-platform previewing and publishing support we have some other exciting features on the horizon.

#ComingSoon

  • Shape Layers: define dynamically drawn layers without the need of an imported image asset.
  • Master Scenes: build shared UI elements, basic interactions, and animations into a template scene and use it across your entire app.
  • Android Support: preview on Android and publish to the Google Play Store. Build once in Lightwell, deploy to any iOS or Android device.
  • Chrome OS: build on Mac and Chromebook devices. If you’re an educational institution using Chromebooks, check out more here.
  • Lightwell API: expand, explore, and learn through code access to Hullabalu StoryKit. Giving you code level access to HSK lets you extend and customize your app even further.

So, “can I publish to Android?”… Soon.

Thank you very much for your continued feedback and for sharing your incredible projects with us! We love hearing about what you’re putting together.

Drop us a line at support@lightwell.pro with any questions, requests, feedback, or just to say hello!

Happy creating! 🖖
Max
