Donnerstag, 18. Juli 2013

The way forward with Python on Qt 5 (with a bit of history)

As you have probably heard on Twitter from the official Jolla account, the first Jolla will ship with Wayland. In that discussion, some worries were brought up about Python support with Qt 5. Here are my personal thoughts on how I see mobile Python development moving forward with the new technology. But first, some background information:

If you want to write Python applications on a mobile platform, you usually need some way to interface with the native graphical toolkit. On Maemo 4 and Maemo 5, this was done via PyGTK and some additional bindings for the Hildon UI elements. PyGTK is licensed under the terms of the GNU LGPL, so you could use it to develop open and closed source applications.

When MeeGo 1.2 Harmattan (the system running on the N9) came along, the toolkit with which third-party application developers wrote their applications changed from Gtk to Qt (or, to be more precise, QML). Just like pure Gtk didn't provide all the UI elements needed for mobile (which is why Hildon was used on top of Gtk), pure QML provided very little, and so Qt Quick Components were used for the Harmattan user interface (that includes things like buttons, toolbars, menus, dialogs, etc.). These could only be used from QML, so you had to use QML if you wanted your application to look anything like a native one (ignoring the MeeGo Touch Framework here, as it wasn't really used for third-party apps, with the notable exception of sociality-mtf).

With Qt and QML being the new toolkit of choice, Python developers needed a different "bridge" to go from the Python world to the toolkit world. PyQt had already existed for some time then; it had two license options: the GNU GPL (basically requiring all your code that uses PyQt to also become GPL'd) and a commercial license. As the Wikipedia article on PySide says, after Nokia failed to reach an agreement with Riverbank Computing (the developers of PyQt) to change the license to the LGPL, they started their own Python bindings named "PySide", which are licensed under the terms of the GNU LGPL. There's a wiki page listing PySide-PyQt differences on the Qt Wiki.

As MeeGo at Nokia was ramped down, the core team of PySide had less and less time to spend on improving it. PySide is still kept up to date (the latest release is from July 9, 2013), and it already supports Python 3, but nobody has ported it to Qt 5 yet. The most recent post I found about PySide and Qt 5 from one of the core developers is from November 2011. For using Python with Qt 5, there are two possibilities at the moment: use PyQt under the terms of the GNU GPL, or get a commercial PyQt license. The other way to get LGPL'd bindings would be to add Qt 5 support to PySide, but somebody has to do that porting work first.

But let's step back a little now. In my experience with writing Python applications for mobile operating systems (starting with Maemo 4 all the way to MeeGo 1.2 Harmattan), there are really two parts: the frontend, which is specified in terms of whatever the toolkit of the day happens to be, and the backend, which (in the case of Python applications) is written in Python.

In Qt 5 / QtQuick, the user interface language is QML, which combines a declarative language with JavaScript for scripting. Writing QtQuick applications with Python doesn't mean replacing JavaScript; it means replacing the "backend" - code that would usually be written in C++ and take care of storage and communication. In fact, as QML grew more powerful over the years, you had to "go down to the C++ level" (or Python in our case) less and less often. And compared to imperative user interfaces (where you still walk your tree of UI widgets with code and then set some button's text to some value), in declarative user interfaces the backend really only provides services to the frontend: the button gets its label from the backend, and when the data in the backend changes, so does the button's text.
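To make the declarative idea concrete, here is a minimal PySide (Qt 4) sketch of such a backend - the class, property and value are made up for illustration, not taken from any real application:

from PySide import QtCore

class Backend(QtCore.QObject):
    # Hypothetical backend exposing a single "label" property to QML
    labelChanged = QtCore.Signal()

    def __init__(self):
        QtCore.QObject.__init__(self)
        self._label = 'Hello from Python'

    def get_label(self):
        return self._label

    def set_label(self, value):
        # When the data changes, the notify signal fires, and any QML
        # binding such as "text: backend.label" updates automatically
        if value != self._label:
            self._label = value
            self.labelChanged.emit()

    label = QtCore.Property(str, get_label, set_label, notify=labelChanged)

The QML side just binds to backend.label; there is no imperative "setText()" call anywhere in the backend.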

So, what we really need is a way to provide services to the QML UI from Python. There's no need to have access to all classes in all Qt modules - even in Qt 4 / PySide times, the classes that I used most often were QApplication (for the event loop), QDeclarativeView (for displaying the QML UI) and QObject (for providing signals and slots to interface the backend with the view). Things like access to the contacts database or GPS location can usually be done directly from QML (and displayed there) - it might be that you don't even need that data in your backend (and if you do, the UI can for example send the phone number of a contact "down to" the Python backend for further processing).
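To show how little is needed, here is a minimal sketch of that Qt 4 / PySide wiring (again just an illustration - "main.qml" and the context property name are placeholders, and Backend is the hypothetical class sketched above):

import sys

from PySide.QtCore import QUrl
from PySide.QtGui import QApplication
from PySide.QtDeclarative import QDeclarativeView

app = QApplication(sys.argv)        # provides the event loop

view = QDeclarativeView()           # displays the QML UI

# Expose a QObject to QML; the QML file can then refer to it as "backend"
backend = Backend()
view.rootContext().setContextProperty('backend', backend)

view.setSource(QUrl('main.qml'))    # placeholder QML file name
view.showFullScreen()

sys.exit(app.exec_())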

But before I tell you where I think mobile Python development should be heading, let's bring up yet another disadvantage of using PySide (which probably applies to PyQt as well) for mobile applications: startup time and responsiveness. Startup time is slow, because the PySide libraries are big (at least on the N9, they are bigger than the Qt libraries they bind), and the dynamic linker has to resolve all Qt symbols at load time instead of just the ones you end up using. Instead of seeing the QML UI right away, your application first has to load the Python interpreter, then load the PySide modules and other modules, and only then load and display the QML UI. And because of Python's Global Interpreter Lock, function calls from QML to Python will make the UI block and not be totally responsive, even if you are using threads.

So, in my opinion, mobile Python applications with Qt 5 have to be fast to load, lightweight and responsive. Fast to load can be achieved by making sure the libraries that get loaded are small, and that ideally the QML UI can already be displayed before the Python interpreter is even loaded and/or initialized. Lightweight can be achieved by not binding all the Qt classes, but only the ones you really need for creating QML applications. And responsive can be achieved by making sure that the interface between QML and Python is asynchronous, so the UI never blocks even if Python is working hard in the background to fulfill the requests of the user interface.
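To illustrate what "asynchronous" means here, the following is a rough PySide sketch of one possible shape for such an interface - it is not how PyOtherSide is implemented, and the class, slot and signal names are invented. QML calls a slot that returns immediately, the work happens on a Python thread, and the result comes back later via a signal (the GIL caveat from above still applies, of course):

import threading

from PySide import QtCore

class AsyncBackend(QtCore.QObject):
    # The result is delivered asynchronously through this signal
    finished = QtCore.Signal(str)

    @QtCore.Slot(str)
    def request(self, query):
        # Return to the QML main loop right away and do the slow work
        # (network, disk, ...) on a worker thread
        threading.Thread(target=self._work, args=(query,)).start()

    def _work(self, query):
        result = 'result for ' + query   # placeholder for the real work
        # Qt queues signals emitted from another thread and delivers
        # them in the receiver's (main/QML) thread
        self.finished.emit(result)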

I currently have a prototype running on Python 3.3 (Python 2.6 and 2.7 are still supported as well, but for me this is a great opportunity to migrate some of my old code to Python 3, and new code should be written in Python 3, because, as you know, Python 2 sucks) and Qt 5 (I've just ported it to Qt 5 today, but it runs just as well on Qt 4 - with all the advantages, like startup time and asynchronicity, still intact). In some unscientific tests that I carried out a few weeks ago, I brought the startup time of gPodder on the N9 down from ~12 seconds (using Python 2 and PySide) to ~3 seconds (using Python 3.3 and the lightweight "PyOtherSide" approach). No code release yet, as this is tailored towards my gPodder experiment right now and has many things hardcoded, but once I clean it up (maybe as a QML plugin) and define the API in more detail, I plan to release it.

In the meantime, here are some (relatively old) videos of PyOtherSide running with Python 3 and Qt 4 on MeeGo Harmattan, BlackBerry 10 and Android (so yes, it is portable).

Donnerstag, 16. Mai 2013

Behind the Scenes: Headset Camera app for the N9

The logical step after the "Volume+ as Camera Button" app (Nokia Store link) for the N9 is another app that allows you to take photos without touching your N9 at all. While time-triggered photos are fun, remote-triggered photos are.. erm.. "funner"? So what kind of remote "buttons" can we easily get on the N9? The remote control button on the headset is both "remote" and a "button". And as seen in Panucci and in gPodder versions since the N900, Bluetooth headset buttons can also be queried by applications. So what do we get by combining remote control and photo taking? The Headset Camera app (Nokia Store link) for the N9! Or - for the visual reader - this:



If you want to integrate such features into your own app, the code for querying the headset buttons is readily available in the gPodder source tree (src/gpodder/qmlui/helper.py):
import dbus

from PySide import QtCore


class MediaButtonsHandler(QtCore.QObject):
    def __init__(self):
        QtCore.QObject.__init__(self)
        # HAL device paths of the headset button input devices
        headset_path = '/org/freedesktop/Hal/devices/computer_logicaldev_input_0'
        headset_path2 = '/org/freedesktop/Hal/devices/computer_logicaldev_input'

        # Listen for "Condition" signals from both devices on the system bus
        # (a D-Bus main loop, e.g. dbus.mainloop.glib, must be set up for
        # the signals to actually be delivered)
        system_bus = dbus.SystemBus()
        system_bus.add_signal_receiver(self.handle_button, 'Condition',
                'org.freedesktop.Hal.Device', None, headset_path)
        system_bus.add_signal_receiver(self.handle_button, 'Condition',
                'org.freedesktop.Hal.Device', None, headset_path2)
 
    def handle_button(self, signal, button):
        # Map HAL button names to the Qt signals declared below
        if signal == 'ButtonPressed':
            if button in ('play-cd', 'phone'):
                self.playPressed.emit()
            elif button == 'pause-cd':
                self.pausePressed.emit()
            elif button == 'previous-song':
                self.previousPressed.emit()
            elif button == 'next-song':
                self.nextPressed.emit()

    playPressed = QtCore.Signal()
    pausePressed = QtCore.Signal()
    previousPressed = QtCore.Signal()
    nextPressed = QtCore.Signal()
MediaButtonsHandler is already a QObject subclass, so you can easily expose an instance of this class to your QDeclarativeView's rootContext() and connect to the signals in QML (such a "headset button handler" might actually be a good candidate for inclusion in nemo-qml-plugins on Sailfish OS and Nemo Mobile?). As it really just uses the Python D-Bus bindings to receive button events from HAL devices, the code above should be easy (read: trivial) to port from Python to Qt/C++. Be aware that you need to connect to both .../computer_logicaldev_input_0 and .../computer_logicaldev_input, which can both exist if you have a cable headset and a Bluetooth headset connected at the same time.
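For reference, here is roughly how the handler can be wired up in a PySide application - a hedged sketch with placeholder names, not the actual Headset Camera code:

import sys

import dbus.mainloop.glib
from PySide.QtCore import QUrl
from PySide.QtGui import QApplication
from PySide.QtDeclarative import QDeclarativeView

# dbus-python only delivers signals if a main loop is set up; the GLib
# main loop cooperates with Qt's event loop on Harmattan
dbus.mainloop.glib.DBusGMainLoop(set_as_default=True)

app = QApplication(sys.argv)
view = QDeclarativeView()

handler = MediaButtonsHandler()     # the class shown above
view.rootContext().setContextProperty('mediaButtonsHandler', handler)
# In QML: Connections { target: mediaButtonsHandler;
#                       onPlayPressed: takePhoto() }

view.setSource(QUrl('main.qml'))    # placeholder QML file
view.showFullScreen()
sys.exit(app.exec_())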

You can get the Headset Camera App for the N9 in the Nokia Store now; there is also a video on YouTube showing the app. Or start integrating headset button features into your own app or scripts by adapting the code above. One use case that comes to mind is using the previous/next buttons on a Bluetooth headset to control a photo slideshow on an N9 connected to TV-Out. Enjoy :)

Dienstag, 14. Mai 2013

HTML5 Web Apps on Mobile Devices

Get out your Buzzword Bingo cards, we're talking HTML5. And Canvas2D. And WebGL. See? Check them off and then continue reading. So, while writing "native" apps using JavaScript is definitely possible and works great with QML, some games are just simple enough (or want to have a broad enough audience) to warrant writing everything in HTML5.

This might also be a good time to check off XMLHttpRequest on your BB card, even if none of the following games use it - you might want to use it in your own applications or games for things such as server-side stored high scores.

As far as Maemo and MeeGo are concerned, you might want to try out some of these games (especially the WebGL variant of One Whale Trip) in Fennec (aka Mobile Firefox - get it for: N900, N950/N9, Nemo Mobile).

Color Lines: This one simply uses Canvas2D, and works nicely on all mobile browsers that I tested - Maemo, MeeGo, Android, WP7.5, BB Playbook, iOS. Comes in at about 650 lines of rather well-documented JavaScript, and could easily be ported to use QML as a renderer if need be (it would be good to have a QML Plug-In that provides a JavaScript context + (a subset of) the Canvas2D API - without using WebKit (cross that off, too), that is). Also, the N900's stock browser has performance problems rendering this, while on the same device in Fennec it's quite playable.

Circle1D: This is a straight Python-to-JavaScript port of a lame 2D "Physics" Engine. It's kept very (read: very, very) simple, and collision detection could be done in a nicer way, but the inefficiency of it provides a nice benchmark for comparing JavaScript engine performance (I'm sure you can find "engine performance" on your bingo card as well) on mobile devices. The N900's default browser can't handle it at all, but Fennec can at least render/simulate it, albeit slowly.

One Whale Trip: This game actually started out as a Python game for PyWeek last September, which was also ported to the N950/N9, but as a test for trying out WebGL, I decided to port it from Python/PyGame to JavaScript/Canvas2D and then to WebGL (the Python version also contains two renderers - a "blitting" one using PyGame surfaces, and an OpenGL one using OpenGL [ES 2 on mobile]). The Canvas2D version again works in all modern mobile browsers (same as above); the WebGL version only works in browsers that support WebGL - for example Fennec/Firefox, even on the N900 (albeit very slowly) - and in none of the stock browsers (not even the one on the N9). As WebGL is "roughly" the same as OpenGL ES 2.x, one could imagine sharing at least the shader programs for a possible C++-or-JavaScript cross-platform application.

So yeah, for smaller applications and/or games, HTML5 is definitely an option. In Firefox OS, your HTML5 web app will - WebGL included - work and integrate nicely as a "native" app. If you also want to create "native" applications (maybe after finishing the HTML5 version), consider encapsulating your JavaScript code so that you can re-use it in QML, or (in the case of WebGL apps) at least design the rendering part of your application in such a way that the code/architecture and shader programs can be shared with a C++ port of your existing HTML5 app.

Another option that's worth considering: writing a compatibility layer that can load (specially-crafted) applications written against a WebGL subset and display them in a fullscreen SDL- (or Qt-)provided window. Applications written in this WebGL subset could then be deployed on the web as HTML5 applications or run "natively" on top of just a JavaScript engine. I'd call that "webglenv", and no, I haven't written it yet.

Sonntag, 12. Mai 2013

Petals for Harmattan - A pure Qt4/Qt5 JS/QML puzzle game

Next up in my list of things I did in the last weeks/months and never blogged about is Petals (Nokia Store link), a "beautiful, brain-teasing puzzle game for 1-4 players" if the game's website is to be believed (I would like to think it is...). As always, here are some technical details about the porting and creation of this game. While another recent game (Tetrepetete) was done on a low level (C++ using no frameworks, interfacing with multiple front-ends directly, including an OpenGL ES frontend, a console-based ncurses frontend(!) as well as a server-sent events/XHR/Canvas2D-based HTML5 frontend(!!)), this one approaches things from a very high level: JavaScript.

Petals: A puzzle game written in pure JavaScript and QML
The gameplay logic is implemented in pure JavaScript (without any QML dependencies), so it could easily be ported to, say, HTML5, but for integration reasons, QML seemed like the better choice for a release on the N9/Harmattan. Also, writing things in JavaScript wouldn't preclude a console-based frontend using nodejs and node-ncurses from happening should the need arise (making the flowers look good in ASCII art would be the challenge there - or cheating by using libcaca). Ok, ok - stop cursing, I'll stop talking about curses (cue laugh track).

Writing pure QML applications has the advantage of easing the port to Qt 5. While QtQuick 1.1 still exists in Qt 5 (and is the only QML option at the moment if you are also targeting iOS), QtQuick 2.0 is usually the better choice for performance reasons.

In my case, the changes necessary to port from QtQuick 1.1 to QtQuick 2.0 were:
  • Change "import QtQuick 1.1" to "import QtQuick 2.0" (sed(1) helps here)
  • Instead of assigning a JavaScript function to a property to create a dynamic property binding (item.someprop = function() { return otheritem.otherprop * 3.0; }), this function has to be wrapped in a call to Qt.binding() in Qt 5 (see "Creating Property Bindings from JavaScript" in the Qt 5 docs)
  • Instead of using SQL Local Storage directly as in QtQuick 1.1, use QtQuick.LocalStorage 2.0, which you can still do in your .js files - use ".import" as described in this blog post
  • In your C++ launcher (in case you need one), QApplication becomes QGuiApplication, and QDeclarativeView becomes QQuickView
  • Use "QT += quick qml" instead of "QT += declarative" in your qmake project file
And that's basically it. Of course, as this is a full-screen game with custom UI, no platform-specific components (such as Harmattan Components or Sailfish Silica) are used, so porting is a bit easier there (no need to "wait" for specific components to be compatible with QtQuick 2.0, which might realistically not happen at all for Harmattan Components). More screenshots of Petals and download links for multiple platforms can be found on the Petals Website.

Mittwoch, 8. Mai 2013

Upcoming: Billboard 1.0.9 for Nokia N9

Turns out I haven't posted here for two months, so here we go again: Billboard, your favorite low-power-mode standby screen, will soon receive a new update - version 1.0.9 was uploaded to Nokia Store QA two days ago, and should hopefully pass QA and be available as an update in the next few days. This release brings a few major under-the-hood improvements and small bugfixes:
  • Fixed MeeCast icon (in 1.0.8, you can already use <<{meecast-icon-src}>>)
  • New formatter that allows you to nest {} expressions used for adding dynamic content
  • Optional image dithering (using # after the filename) for better colors in low power mode
With the new formatter, you can now output {} expressions in your scripts so that they get replaced, and similarly pass {} expressions as parameters to your scripts (for example to modify them in some way before displaying). This should allow for even more customization, some examples of what users have been doing on their N9 standby screen can be seen in the Billboard Standby Screen support thread on talk.maemo.org.

If you are looking for additional ways to tweak and enhance your Billboard-on-N9 experience, have a look at billboard-scripts, a growing collection of Shell and Python scripts that provide even more ways of customizing your standby screen.

If you haven't purchased Billboard from Nokia Store yet, you can get the current version now for your N9, and get the upgrade to 1.0.9 as soon as it's available. If you are already a happy user, watch your application updates in the next few days, and get the new version.

Dienstag, 5. März 2013

Porting Harmattan Apps to Sailfish Silica (and back)

So the Sailfish SDK was released last week, and as explained in the last blog post, gPodder is already running on Sailfish Silica Components. Of course, this has only been possible because Silica is quite similar in API design to Harmattan Qt Components (whenever I write "Harmattan" in this blog post, I usually mean Harmattan Qt Components, and whenever I write "Sailfish", it usually means Sailfish Silica Components). But of course porting "from" Harmattan "to" Sailfish with no way back would be kind of annoying - either Harmattan gets dropped, or somebody has to maintain two codebases, something I'd rather avoid. So, just like in "good old" Maemo 4 and Maemo 5 times, the goal here is to convert a Harmattan-only codebase into a Harmattan-and-Sailfish one, so that both can be maintained in the same codebase and improvements to the Harmattan version benefit the Sailfish port and vice versa.

In fact, porting gPodder from Harmattan to Sailfish reminded me of porting from Maemo 4 to Maemo 5 - same toolkit (then Gtk+, now Qt/QML), similar extensions (then Hildon, now Components) but different concepts (then normal menus in Maemo 4 vs. HildonAppMenu in Maemo 5, now toolbars and context menus in Harmattan vs. pulley menus in Silica).

Obviously, as gPodder is written in Python here, this applies mostly to PySide-based applications, but the real difference is in the QML (there's really only one short snippet in the Python code that decides which QML import path to use). The approach taken is somewhat similar to what the Jolla guys presented at FOSDEM 2013, but I don't recall everything and have tailored the approach to better fit my workflow.
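As a hedged sketch of what such a snippet can look like (the detection check and directory names below are made up - the actual gPodder code may differ):

import os

def qmlui_import_path(ui_base):
    # Decide which implementation of "org.gpodder.qmlui" to put on the
    # QML import path: the Silica-based wrappers if Sailfish Silica is
    # installed, the Harmattan-based wrappers otherwise
    if os.path.exists('/usr/lib/qt4/imports/Sailfish/Silica'):
        return os.path.join(ui_base, 'qmlui', 'sailfish')
    return os.path.join(ui_base, 'qmlui', 'harmattan')

# Usage with a QDeclarativeView called "view":
#     view.engine().addImportPath(qmlui_import_path(UI_BASE_DIR))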

Before we start, note that qt-components is available in the Sailfish Emulator (once you install it via "zypper in qt-components" as root), so that could be a shortcut for quick porting exercises and an interim solution.

For starters, I've cleaned up gPodder's QML UI to make better use of Qt Components (historically, gPodder's QML UI was started at a time when Harmattan Qt Components were not even out or announced, so wazd and I came up with our own style - yes, that's pre-feb11 even!). So, instead of custom transitions and state management, everything is a Page, and transitions happen via the PageStack (the Silica equivalents are also called Page and PageStack). So, assuming you only have pages and so on, you could theoretically already use one or the other component set depending on which platform you are on. Unfortunately, QML doesn't have conditional imports or anything like that, so we have to add an abstraction layer.

Here's the basic idea:
  1. Create a new "Components" set - I'm calling mine "org.gpodder.qmlui"
  2. You define which items your components have, and a common API
  3. You implement your components twice: Once for Harmattan, once for Sailfish
  4. At runtime (or even at compile time, if you want) you decide which set of components to use
  5. Your application GUI code only imports QtQuick, maybe QtMultimediaKit (if you need it - it's available on both platforms) and your custom components (org.gpodder.qmlui in my case)
Just in case you are already running to the gPodder Git repo to check it out, here's a short disclaimer: the UI isn't fully ported yet, so there are still a few remaining "com.nokia.meego" (Harmattan Qt Components) imports in my QML UI, but that's just temporary until I have found a good replacement for things like Sheet and MultiSelectionDialog in Silica.

My custom components at this time include WindowWindow, PagePage, ListList, ScrollScroll, ActionMenu, Action and Button.
Most of these are just thin wrappers around the "native" components on the target platform. For example, WindowWindow.qml (harmattan, sailfish) is just a wrapper for PageStackWindow (on Harmattan) and ApplicationWindow (on Sailfish). Where I said "import com.nokia.meego 1.0" and "PageStackWindow {}" previously, I now say "import org.gpodder.qmlui 1.0" and "WindowWindow {}". In the case of the Harmattan implementation, this is just the same as before, but on Sailfish, an ApplicationWindow will be used.

Other wrappers just deal with API differences. For example, ScrollDecorator exists in both Harmattan and Sailfish, but in Harmattan the property is called "flickableItem" and on Sailfish it's "flickable". So my ScrollScroll (harmattan, sailfish - also, noticed a naming pattern there? ;) takes care of hiding the differences, and I just use "flickable" on a ScrollScroll, and it will do The Right Thing on the platform it's currently running on.

When the API is the same (e.g. for Button), you still need to create pass-through components because of the different import path: Button (harmattan, sailfish).

But there are also some more elaborate differences. For example, I want to have menus - a toolbar menu on Harmattan and a pulley menu on Sailfish. So I defined a very simple Action (I'd be surprised if such a component doesn't already exist, but I didn't find one when I wasn't looking (sic)) that basically has a text and a signal attached to it. In my application code, I define actions as a property on my PagePage (which is a Page with some special code to transform actions into whatever the platform representation is). See the main page actions for an example. The ActionMenu (harmattan, sailfish) then takes care of creating a ContextMenu (Harmattan) or PullDownMenu (Sailfish), and PagePage (harmattan, sailfish) takes care of creating a toolbar on Harmattan and of forwarding a reference to the listview to which the PullDownMenu should be attached (on Harmattan, I still need to pass the listview there, but it's not used).

The same customization also happens for the ListList (harmattan, sailfish), which is a ListView (Harmattan) or SilicaListView (Sailfish), but also takes care of displaying a header (which still needs to be styled correctly on Sailfish) if the platform requires it.

Once you've created your custom set of components, you have to create a qmldir file that lists the components available and in which version of the components they are available. Read up on Declarative Modules in the official documentation.
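For illustration, a qmldir file for a component set like the one described here could look roughly like this (the exact entries and version numbers are just an example):

WindowWindow 1.0 WindowWindow.qml
PagePage 1.0 PagePage.qml
ListList 1.0 ListList.qml
ScrollScroll 1.0 ScrollScroll.qml
ActionMenu 1.0 ActionMenu.qml
Action 1.0 Action.qml
Button 1.0 Button.qml

One copy of this file lives in each implementation directory (Harmattan and Sailfish), so the same "import org.gpodder.qmlui 1.0" resolves to different files depending on which directory is on the QML import path.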

So there you have it. One QML codebase, two UX targets that are well-integrated. You can see the results in the just-released gPodder 3.5.0, which is more Harmattan-ish on the N9 than before (although some items have moved from the toolbar into the menu for simplicity reasons, and I actually think it's cleaner now) and also looks and feels quite native on Sailfish (it's not done yet, and I'm sure the UI will evolve and adapt once Silica gets more mature and we see more Sailfish apps).

What I'd like to see in the Sailfish SDK: Different Ambiance backgrounds (darker, brighter, different hues) so that developers like me can test if their apps look good atop more than just the nice blue default background.

As a last hint (I couldn't find that in the API docs, but in the component gallery): Sailfish Silica's Label component has a "truncationMode" property. If you set it to "TruncationMode.Fade", you'll never want to see elided text again, because it looks so sexy! :)

Samstag, 2. März 2013

How to try out gPodder in the Sailfish OS Emulator

You might have seen this one coming: gPodder is already working on Sailfish OS. If you want to try it out in the emulator yourself (no MP3 playback due to missing codecs, and some parts of the UI have not yet been ported), install the Sailfish SDK and start the emulator (thanks to the interpreted'ness of Python, we don't have to care about cross-compiling at this point). Then, SSH into the emulator as user "nemo" (I'm purposefully vague here - if you can't figure out how to SSH into the emulator, then you probably shouldn't be trying it out at this point).

From the "nemo" user, become root (use "su -", root password is "nemo") and then install some dependencies:

zypper in python-pyside git qt-components

With that in place, go back to the "nemo" user and get gPodder from Git:

git clone git://github.com/gpodder/gpodder.git

Then, cd into the Git checkout and start it as usual:

cd gpodder
python tools/localdepends.py
bin/gpodder

Again, you don't have to do any installation or compilation steps for gPodder - it will work straight out of a Git checkout (that's how I use it all the time). If you "export" the Emulator as an appliance in VirtualBox and then "import" it on a different machine, you can even work with this nicely on Mac OS X and Windows. The fact that the emulator is just another Mer installation also means that you can install a compiler and -devel packages for quick development and testing. Vim 7.3 is already installed; I only wish zsh were also available in the Mer repos :)

Dienstag, 19. Februar 2013

MP3 playback in Nemo Mobile on the N950

If you've been playing with Nemo Mobile on your N950 recently and wanted to do more than just swipe around the Lipstick UI, you might have noticed that while there's a music player app, it can't play back MP3 files (OGG files seem to work fine). This is a quick'n'dirty log of what I had to do to get MP3s playing (I've looked in the repos for something similar, but couldn't find it):

First, install the Mer Platform SDK:
https://wiki.merproject.org/wiki/Platform_SDK

Then, get SB2 (for armv7hl, as this is what Nemo-on-N950 uses):
https://wiki.merproject.org/wiki/Platform_SDK#Compiling_with_the_SDK

Then, set everything up so you can use "nemo-n950" as target with sb2:
https://wiki.merproject.org/wiki/Platform_SDK_and_SB2

Install the build dependencies (you might need more than these; use "zypper se" to search for package names):
sb2 -t nemo-n950 -m sdk-install -R zypper in gstreamer-devel gst-plugins-base-devel gst-plugins-bad-free-devel gstreamer-tools orc-devel zlib-devel

Get the gst-ffmpeg sources (use version 0.10.11, due to bug 655238):
http://gstreamer.freedesktop.org/src/gst-ffmpeg/

Extract the sources, then do:
sb2 -t nemo-n950 ./configure --prefix=/usr
sb2 -t nemo-n950 make
mkdir tmp
DESTDIR=$(pwd)/tmp/ sb2 -t nemo-n950 make install
cd tmp/
scp -r . root@192.168.2.15:/

The last step obviously assumes that your device is connected and USB networking is properly set up. And then we hear somebody say "Well, but why not package it properly?". Ok. Take this modified gst-ffmpeg.spec file (based on gst-ffmpeg.spec already included in the sources). Then build a package using:

mb build -t nemo-n950 gst-ffmpeg.spec

This will leave you with gst-ffmpeg-0.10.11-1.armv7hl.rpm in ~/rpmbuild/RPMS/armv7hl which you can then scp and rpm -i to your device.

Donnerstag, 7. Februar 2013

FOSDEM 2013

I attended FOSDEM 2013 in Brussels, Belgium last weekend. It was my first FOSDEM, and as such, I was quite positively surprised by the location/setting (a big university with too many rooms and tracks to visit them all) and the reach of the event (open source projects from every corner you can imagine).

It was also nice to catch up with old friends from Maemo/MeeGo times like Quim, but at the same time it was impossible to say hi to everybody, as the location is so big, the event only lasts two days and everything is quite hectic and crowded. Still, I managed to meet community celebrities like rzr (of Harmattan Community Repository fame) and e-yes (of Nitdroid-on-N9 fame) in person at the event, which was really nice.


Apart from meeting people and having a good time in Downtown Belgium (so many beers to choose from), Jolla Mobile was also present at the event, and I managed to attend two talks (QML App Development and Porting Nemo to new Hardware), where I found out about Sailfish.Silica 1.0 (Jolla's Own Version of Qt Components), and Open Source Components of Jolla (really good to see Sailfish Silica open source'd and also good to see maliit and contextkit used as middleware).


The photos of the weekend, including a quick sightseeing tour on Monday as well as your usual dose of food porn can be found on Flickr.

Dienstag, 29. Januar 2013

Tetrepetete, SMS Backup and apkenv updates

Whew, January is pretty much over already, and there haven't been any updates here. So here's a short cumulative update of what's happened over the last few weeks:

Tetrepetete
If you are not afraid of falling blocks, and don't mind the lack of colors, give Tetrepetete a try. Not to be confused with a game of a totally different name, this game brings falling blocks of 4 to your N900 and N950/N9. While the full color version is still not available as such, the free greyscale version is up for grabs as .deb on the website. Including a cameo appearance by That Rabbit from That Rabbit Game for no good reason. Play or discuss. Or be productive. Or something.

SMS Backup GUI
Instead of rolling my own solution from scratch, I found the wonderful MeegoSMSBackupRestore project by Tony Wang, which is a command-line tool for Harmattan devices to backup and restore SMS messages (as if that wasn't clear from the application name..). It was missing a GUI, so I've added one. The announcement and download, as well as the Debian source package (.dsc, .tar.gz) are available from an Internet near you. Feel like Git? We have you covered as well.

apkenv updates
If you haven't been watching the apkenv Git repo closely (I know you haven't), you might have missed the Pandora port by crowriot, which - while not being directly useful to Maemo/MeeGo users as such - also brought improvements to the Cut the Rope module, which somehow works now. To be discussed at talk.maemo.org, and patches (as always) welcome.

In other (totally unrelated) news, the thesis is done (yay!).