Planet KDE - http://planetKDE.org/

Attaching debugger and ptrace_scope

Wed, 07/15/2015 - 18:55

In Fedora 22, if you try to attach a debugger to a running process, even one owned by the same user, gdb will politely refuse with the error message:

ptrace: Operation not permitted.

The reason is a newly enabled security feature, YAMA, which specifically restricts inspecting the memory of other programs. See RH bug 1196825 for the original discussion. This information is yet to be reflected in the Fedora security features matrix. For why this restriction is a good thing, read the Linux kernel documentation. Note that the restriction doesn’t affect programs started by the debugger, such as “gdb myprogram”. To enable debugging of running programs, as root do:

echo 0 > /proc/sys/kernel/yama/ptrace_scope

To enable that permanently, do:

echo kernel.yama.ptrace_scope = 0 > /etc/sysctl.d/10-ptrace.conf
Tagged: fedora, hacking

Ubuntu Policy Complies With GPL But Fails To Address Other Important Software Freedom Issues

Wed, 07/15/2015 - 16:53

Today Canonical published an update to their IP Policy by adding in a “trump clause” paragraph saying that “where any other licence grants rights, this policy does not modify or reduce those rights under those licences”.

I’ve been going on about this IP Policy (which is Canonical’s but confusingly appears on the Ubuntu website) and how it was incompatible with Ubuntu policy for years.  I’ve been given a lot of grief for querying it, having been called a fucking idiot, aggressive, a swearer of oaths, disingenuous, dishonest, untrustworthy and unappreciative.  It really shows Ubuntu at its worst, and it is really amazing that such insults should come from the body which should be there to prevent them. And I’ve heard from numerous other people who have left the project over the years because of similar treatment.  So it’s nice to see both the FSF and the SFC put out statements today saying there were indeed problems, but sad to see they say there still are.

“Canonical, Ltd.’s original policy required that redistributors needed to recompile the source code to create [their] own binaries” says the SFC, and “the FSF, after receiving numerous complaints from the free software community, brought serious problems with the policy to Canonical’s attention”.  Adding the trump clause makes any danger of outright violation go away.

But as they both say, there are still dangers of it being non-free by restricting non-GPL code and using patents and trademarks.  The good news is that doesn’t happen: the Ubuntu policy forbids it, and there’s a team of crack archive admins to make sure everything in the archive can be freely shared, copied and modified.  But the worry still exists for people who trust corporate sayings over community policy.  It’s why the SFC still says “Therefore, Conservancy encourages Canonical, Ltd. to make the many changes and improvements to their policy recommended during the FSF-led negotiations with them” and the FSF says “we hope they will further revise the policy so that users, to the greatest extent possible, know their rights in advance rather than having to inquire about them or negotiate them”.  Well, we can but hope, but if it took two years and a lot of insults to get a simple clarifying paragraph added, and things like this happen: “After a few months working on this matter, Conservancy discovered that the FSF was also working on the issue” (did nobody think to tell them?), I don’t see much progress happening in future.

Meanwhile the Ubuntu Developer Membership Board wonders why nobody wants to become a developer any more and refuses to put two and two together.  I hope Ubuntu can re-find its community focus, but all I can take from today’s announcement is that the issues I spoke about were real concerns, even if no more than that, and they haven’t gone away.


KDevelop: Concentration as a feature

Wed, 07/15/2015 - 08:28

One of the things I’ve heard on every KDevelop sprint is: we should be light, like Kate but with our added value.

We’ve addressed this in many ways so far: we’ve optimized the code for performance so it’s more responsive and starts reasonably fast, and we’ve made sure most functionality is accessible using the keyboard, so we don’t feel clumsy and overwhelmed by all the options.

Today, I wanted to reflect on two things:

  • What do we miss to be that ideal light editor?
  • What’s a good lightness inspiration?

Otherwise, TLDR, there’s a video below.

What do we miss?

The avid Planet KDE reader will know that being light is a complex concept; there are many ways to perceive lightness: Is Kate light? Is KWrite light?

When we talk about lightness, we generally refer to different metrics. If something is worth waiting for, we wait, and that’s fine; we just don’t want to feel we’re wasting our time. A good example, I’d say, is Chromium. It’s probably one of the heaviest beasts we run on our systems. Nevertheless, it doesn’t feel like it (at least until you run out of memory).

There’s another point of view: we bombard users with features. It’s awesome to have features, and it shouldn’t be a trade-off. On the other hand, we never use all the features at the same time, so optimizing for that would be awesome. We should work on it: identifying the different workflows and creating user interfaces that enable them.

What’s the role model?

One of the developments that has struck me the most during the last years is Kate. Instead of focusing on the editor, it went the KDevelop route: it started to offer all of the information at once (especially odd, given that there’s quite some feature overlap).

More generally, if I look at what others are doing, I see two major groups:

On one hand, there’s atom.io and Sublime, which seem to be doing something very similar to each other. Most of the components they have we have as well, but the focus is rather different: they have very few visual interaction components, mostly just the menu, so you can just play with the text and you know where to go look for stuff. UIs are usually embedded in the source code view.

On the other hand, there’s Eclipse or Visual Studio that do something quite similar to what we do: present everything in the traditional desktop application way with docks and a couple of smart tricks, because they try to tackle the whole workflow problem.

I understand how we got to where we are, but I also really understand why people can prefer something like Atom. My feeling is that “it doesn’t make me feel like I should be debugging while I’m coding”, even though they oversimplify in some areas.

What do I propose?

I decided that I wanted to feel what it’s like to work without all the clutter, so I implemented a Concentration Mode for KDevelop. I stripped away most of the visual stress, so usually we’re left with the good ol’ katepart editor, with all of KDevelop’s features. I’ll become my own guinea pig: now that I have it clean, how do I use the tools? Can I still do the same things? As discussed on the mailing list, this has already borne some fruit.

I think it’s really cool: code, code, code, quick open and shortcuts.

The Video!

Now, a short video demonstrating what I discussed here.

What to look for:

  • We get to hide the navigation handle, leaving only the toolbar, the document title and the central view.
  • The toolviews are still accessible.
  • The menu is still accessible.
  • I can decide to just see what I’m working on. Right now.

Happy hacking!

thoughts on being merciful binary gods

Wed, 07/15/2015 - 00:28

“Since when has the world of computer software design been about what people want? This is a simple question of evolution. The day is quickly coming when every knee will bow down to a silicon fist, and you will all beg your binary gods for mercy.” Bill Gates

For the sake of the users, let’s assume Bill was either wrong or (||) sarcastic.

Let’s say that we want to deliver Freedom and privacy to the users and that we want to be more effective at that. We plan to do that through quality software products and communication — that’s how we reach new users and keep them loving our software.

We can’t get away with half-assed software that more or less always shows clear signs of “in progress”; we need to think our software through from a user’s point of view and then build it accordingly. We need to present our work at eye level with commercial software vendors; it needs to be clear that we’re producing fully reliable software on a professional level. Our planning, implementation, quality and deployment processes need to be geared towards this same goal.

We need processes that allow us to deliver fixes to users within days, if not hours. Currently, in most end-user scenarios, it often takes months, and perhaps even a dist-upgrade, for a fix for a functional problem in our software to arrive.

The fun of all this lies in a more rewarding experience of making successful software, and learning to work together across the whole stack (including communication) to work together on this goal.

So, with these objectives in mind, where do we go from here? The answer is of course that we’re already underway, not at a very fast speed, but many of us have a good understanding of the above structural goals and have found solutions that work well.

Take tighter and more complete quality control, at the heart of the implementation, as an example. We have adopted better review processes, more unit testing, more real-world testing and better feedback cycles with the community; especially the KDE Frameworks and Plasma stacks are well maintained and stabilized at high speed. We can clearly say that the Frameworks idea worked very well technically, but also from an organizational point of view: we have spread the maintainership over many more shoulders, and have been able to vastly simplify the deployment model (away from x.y.z releases). This works out because we test especially the Frameworks automatically and rather thoroughly through our CI systems. Within one year of Frameworks 5, our core software layer has settled into a nice pace of stable incremental development.

On the user interaction side, over the past years our interaction designers have been joined by visual artists. This is clearly visible when comparing Plasma 4 to Plasma 5. We have had help from a very active group of visual designers for about one and a half years now, but have also adopted stricter visual guidelines in our development process and forward-thinking UI and user interaction design. These improvements in our processes have not just popped up; they are the result of a cultural shift towards opening KDE to non-coding contributors, and creating an atmosphere where designers feel welcome and can work productively in tandem with developers on a common goal. Again, this shows in many big and small usability, workflow and consistency improvements all over our software.

To strengthen the above processes and plug the holes in the big picture to make great products, we have to ask ourselves the right questions and then come up with solutions. Many of them will not be rocket science; some may take a lot of effort by many. This should not hold us back, as a commonly shared direction and goal is needed anyway, regardless of our ability to move. We need to be more flexible, and we need to be able to move swiftly on different fronts. Long-standing communities such as KDE can sometimes feel like they have the momentum of an ocean liner, which may be comfortable but takes ages to change course, while they really should have the velocity, speed and navigational capabilities of a Zodiac.

By design, Free Culture communities such as ours can operate more efficiently (through sharing and common ownership) than commercial players (who are restricted, but also boosted by market demands), so in principle, we should be able to offer competitive solutions promoting Freedom and privacy.

Our users need merciful binary source code gods and deserve top-notch silicon fists.

KDevelop Checker Framework – pushed

Tue, 07/14/2015 - 17:39

Hi there!

I’m pleased to announce that the KDevelop Checker Framework has been pushed to the KDevPlatform repository. Here are some details about it:

GUI changes

  • Moved ProblemModel to shell
  • Reworked the Problems toolview. Now it works like this:
    • ProblemModels are added to ProblemModelSet.
    • ProblemReporterFactory makes instances of ProblemsView.
    • ProblemsView takes the models from ProblemModelSet (and subscribes for updates about them, so if one is added or removed it can add/remove their views) and provides a tabbed widget where the views for them can be added. It creates instances of ProblemTreeView, which show the problems in a ProblemModel, and adds them to the tabs. The tabs also show the number of problems in the ProblemModels.
    • The toolview only adds actions that are supported by the model (for example: filtering, grouping, reparsing, showing imports; obviously reparsing doesn’t make sense for runtime problem checkers).

See the video:

  • First it shows that the “old” problem reporter still works as intended (which also uses the new code now)
  • Then from 1:07 onward it shows an example problem model/view working with randomly generated test data.
  • It shows the features of the new model(s), that is filtering by files/project and issue severity.
  • It also shows the grouping support (grouping by severity, and by path).

ProblemModel details

  • Broke up ProblemModel into 2 parts
    • Base ProblemModel that provides the QAbstractItemModel interface for views and can use various ProblemStores to store problems. By default it uses FilteredProblemStore.
    • ProblemReporterModel is basically the old ProblemModel that grabs problems from DUChain, it’s a subclass of ProblemModel.
  • ProblemStore simply stores problems as a list (well technically it stores them in a tree, but it only has 1 level, so it’s a list). There’s no filtering, no grouping. It’s perfect for ProblemReporterModel since it does filtering itself when grabbing the problems from DUChain.
  • FilteredProblemStore DOES filtering, and grouping itself. It stores problems in a tree (ProblemStoreNode subclasses). The tree structure depends on the grouping method, which is implemented with GroupingStrategy subclasses.
  • Moved WatchedDocumentSet and its subclasses from ProblemModel to ProblemStore, as it is really a detail that the model itself doesn’t need; ProblemStore, which actually stores the problems, is what needs it.
  • Created a new Problem class, DetectedProblem and moved both this and the “old” Problem class in under the IProblem interface. The intent here was to create a class with a clear interface for problems, which ProblemStore can simply store. I wanted to eventually clear the problems out of DUChain and replace the “old” Problem class with it. However I realized that it’s not practical because of the “show imports” feature which shows the problems from imported contexts. Unfortunately DUChain is the class that knows those, and it’s way too much work to get it out from it. Not to mention it doesn’t even make sense, since it’s really something that logically belongs there.

Using this new system is fairly straightforward:

All one has to do is instantiate a model and add it to the model set:

KDevelop::ILanguageController *lc = KDevelop::ICore::self()->languageController();
KDevelop::ProblemModelSet *pms = lc->problemModelSet();
m_model = new KDevelop::ProblemModel(this);
pms->addModel("Test", m_model);

Then later inject problems into it:

KDevelop::DetectedProblem *p = new KDevelop::DetectedProblem();
p->setDescription("Some message");
p->setFinalLocation(KDevelop::IndexedString("/just/a/bogus/path/yada.cpp"));
p->setSource(KDevelop::IProblem::Plugin);
p->setSeverity(KDevelop::IProblem::Error);
m_model->addProblem(KDevelop::IProblem::Ptr(p));

Here’s a class diagram about the relevant classes:



kdev-krazy2 ported to KF5

Tue, 07/14/2015 - 16:26

Good news everyone!

The KDevelop frontend for Krazy tools has been ported to KF5, so it now works with the KF5 version of KDevelop.



Compilation Copyright Irrelevant for Kubuntu

Tue, 07/14/2015 - 14:24

Joel Leclerc’s recent post The importance of freedom in software reminds us that the reason we contribute to projects like Ubuntu is that they are made for sharing. Use it, modify it, improve it, share it. Anywhere, any time and with any number of people all over the world. No licence required.  Take that away and you take away the reason for people to contribute.

Recent comments by a CC member that our ability to modify, improve and share it might be restricted by compilation copyright are a dangerous threat to our community.  It’s the sort of thing the Community Council should be there to take a stand against, but alas no.

Compilation Copyright?

Compilation copyright is an idea exclusive to the US (or North America anyway).  It restricts collections of items which otherwise have unrelated copyright restrictions.  A classic example is a book collection of poetry where the poems are all out of copyright but the selection and ordering of poems is new and has copyright owned by whoever did it.

It’s completely irrelevant outside the US, where most of the world is located, but we like to look after everyone, so what’s the situation for people in the US?

Kubuntu images are made from lists of packages in seed files which get made into meta packages.  You could indeed argue that this meta package is subject to compilation copyright, I couldn’t find any case law on it so I suspect it’s entirely undefined.  The good news is the meta package has always been GPL 2 licenced so voila, no copyright restrictions beyond the norms of free software.

The seed repository curiously lacked a licence until I added the GPL the other day.  It has a number of copyright holders, primarily me (from before and after I worked for Canonical) and Canonical (from when I did).  Anything on Launchpad has to be free software, so we can safely say the same applies here. And the seed isn’t what’s distributed on the images; the meta package is.

And of course it’s easy to replicate, the list of packages is just those that come from KDE for the most part so you can argue any compilation copyright is KDE’s, which in the case of Plasma is me again as release dude.  And I pick GPL.

And in case anyone is getting confused, this has nothing to do with GCC style compilers, running some code through a compiler makes no difference whatsoever to what copyrights apply to it and nobody has ever seriously said anything different unless they’re trying to muddy the waters.  I recently had Mark Shuttleworth say that of course you could copy individual binaries from Ubuntu.

But but… you’re not a lawyer

It’s too complex for you…you’re too small and too wee and you need those weapons of mass destruction to prevent terrorism… was the self-deprecating argument the unionist politicians came up with for voting no to Scottish independence.  It worked too, for now, amazing.

Similarly I’m amazed at how otherwise very intelligent free software geeks look down on their ability to understand copyright and other laws.  If we took this attitude to coding I’d never have started contributing to KDE and I’d never learn what real coding is like.  If you want to know about an area of law it’s the same as coding, you read some textbooks, read some acts of parliament, read some EU directives, read some case law and come up with a decision.  It’s exactly what judges do when they make a decision, no different.

Based on the above I have maintained the KDE licence policy and reviewed thousands of packages into the Ubuntu archives. So I feel equally competent to make the obvious declaration that compilation copyright has no relevance to Kubuntu, because we freely licence the meta package.  Remember, geeks: you are strong and free. Don’t let anyone talk you down with unspecified scaremongering like “things get even more complicated”; if they can’t say what makes it complicated then you can safely ignore it.

 


OpenStack Summit Tokyo: Call for Speakers

Tue, 07/14/2015 - 11:24
The next OpenStack Summit will take place in Tokyo, Japan from 27-30 October 2015. The Call for Speakers period has been open for a few days and will close on July 15th, 2015, 11:59 PM PDT (July 16th, 08:59 CEST).
You can submit your presentations here. I am currently working on a proposal to speak about OpenStack and Ceph HA aspects.

Time to gear up!!!

Tue, 07/14/2015 - 08:12
Mid-term evaluations are over and it’s time to gear up.
Here is the work done this week:
  • Layers feature:
    • Separate search options for the tree view: By mistake, I had used the same search options that were used for the contents tree view. Now I have separated the search options for the layers tree.
    • Reloading the document in the right order: Initially, on changing the check state of the layers tree, the whole document was reloaded; now the pages are reloaded starting from the current page. Later, if time permits, I will try to introduce a page number in the poppler opt-content model, which could be used to refresh only the affected pages.
    • Hide the layers button from the sidebar when not present: Not many PDFs have layers in them, so the layers view is hidden completely when the document has none.
    • Currently, I am working on toggling form widgets as well when the check state is changed.
  • Linearization feature: The KIO::open job does not support the HTTP protocol, and PDF linearization is mostly used over HTTP. So now I am working on implementing it through a KIO::get job.
  • Tags feature: The back-end support for the tags model in poppler-qt is in place. Currently, I am working on deciding what we need to display in the tree and making the code ready to commit.

KDE Applications 15.04.3 available

Mon, 07/13/2015 - 18:14


KDE's third update of its 15.04 series of Applications is now available in Chakra. With this release kde-workspace has also been updated to version 4.11.21, and kdelibs and kdepim to 4.14.10. Keep in mind that the applications that have been ported to Frameworks 5 will not be updated in the stable repositories but remain at their previous versions. The new versions of these packages are available in the [kde-next] repository, which provides Plasma 5.

If you want to test Plasma 5 under Chakra, you can follow the instructions on the forum. Feel free to let us know of your comments and feedback on the related thread.

In addition, the following notable updates are now available with this move:

  • calligra 2.9.6
  • firefox 39.0
  • thunderbird 38.1.0
  • wine 1.7.46
  • pip 7.0.3

It should be safe to answer yes to any replacement question from Pacman. If in doubt, or if you face any other issue, please ask or report it in the related forum section.

As always, make sure your mirror is fully synced (at least for the core, desktop and platform repositories) before performing this update, by running the mirror-check application.

Vacation 2015

Mon, 07/13/2015 - 16:33

So, vacation has finally arrived in 2015. To the despair of some, and the joy of others, the Swedish standard vacation is 3-5 weeks over the summer. I’ll be having five weeks this year.

So, what do you do with five weeks? Generally, I start out ambitious, end up in reality after 3-4 weeks, and then scramble to get something done. Here is my list for summer 2015:

  • Hang out with the kids and wife and do fun stuff.
  • Do some work around the house (a shed for our bikes and some general painting are on the wish list).
  • Get the calendar for foss-gbg into place. It does look as if we will have an awesome autumn.
  • Work on a whole bunch of little Yocto projects that I’ve been planning for a while (meta-kf5 being one of the high-priority ones, and playing with various targets such as the Raspberry Pi 2, BeagleBone Black and more).
  • Getting my 3D printer back into shape and doing something fun with it (if it rains a lot).

That pretty much summarizes my ambitions – but first things first – let’s spend a few days hanging out with the kids before they grow tired of me and think that I’m old and boring :-)

KDEPIM report (week 27)

Mon, 07/13/2015 - 06:30

Last week I decided to test and debug Kleopatra and KNotes:

Kleopatra and KNotes:

I cleaned up some code, and I ported some parts of it.

But KNotes needs more love. The code is very old (from the Qt3 version) and needs to be adapted. It will take some time.

Of course I also fixed some bugs.

Other work on KDEPIM:

I worked a lot on KContacts, the library used by KAddressBook to generate the VCard files, for example.

I decided to finish the vCard4 support.

Now we have IMPP support. (We still can’t use it in KAddressBook, but IMPP fields used by ownCloud, for example, should not be lost when we sync resources.)

I added a lot of autotests to make sure that all vCard features can be imported and exported.

Another piece of work that I started last Friday is using QVector instead of QList in kdepimlibs (it’s an optimization). I did it before the API freeze.

As usual I fixed a lot of porting bugs.

Future:

This week I will continue to work on Kleopatra and KNotes, but I will test KOrganizer too.

KStars – A new look!

Sun, 07/12/2015 - 21:09

With more than a month still remaining, my Google Summer of Code project is nearing completion! KStars is now able to display all 88 Western constellations. I want to thank my mentor, Jasem, for helping me by making a dialog box that displays constellation images with their parameters in real time. This made my job simpler by a very large margin. Instead of using mathematical equations to figure out the ‘best fit’ for an image, I simplified the task by noting down the RA/DEC coordinate pairs for the ‘midpoints’ of constellations from Stellarium. This helped me figure out where to translate each image in the sky map. Then I played around with the position angle, width and height of each image so as to best fit the constellation lines. I had to repeat this for all 88 constellations, but it was still a much simpler solution. Lastly, I replaced all 88 images with versions that have transparent backgrounds, to avoid neighbouring images being cut off by the black backgrounds that were previously present. KStars looks good now, and I feel happy seeing the results! Here’s the new look of KStars!

 


 

Next up is to do this again for one non-Western sky culture, and to learn Doxygen to start writing documentation for the code that I have written! I aim to finish everything within 10 days now! A lot of help from my mentor, the KStars community, and an amazing project have made Google Summer of Code a great experience for me!

Marble 1.11.3 for Windows

Sun, 07/12/2015 - 14:49

In the last months I did not have any system running Windows and therefore could not create new Marble Windows packages. My new T450s, however, came with Windows 7 preinstalled, so that problem is gone. If you are running Windows, please give the new packages a try:

Please leave a comment whether they work for you. I’ll add the download links to marble.kde.org in that case. Compared to the last Windows packages (Marble 1.9.1) there’s an upgrade to Qt 5.5 inside and several new features in Marble itself, e.g. improved support for tours and map editing (the treasure map in the screenshot was done with that) as well as a couple of new map projections.


DIY Net Player

Sun, 07/12/2015 - 14:21

Something I wanted to share is a device I recently built. It is a complete net player for music files, served by a NAS. It is finished and “in production” now, so here is my report.

The cherry ear net player, based on Raspberry Pi and HifiBerry Amp+.

Hardware

The device is based on a Raspberry Pi Model B+ with a HifiBerry Amp+.

Cherry Ear backside with connectors and power supply

The Amp+ is a high-quality power amplifier that is mounted on the Raspberry mini computer, bypassing the Raspi’s suboptimal onboard audio. Only loudspeakers need to be connected, and this little combination is a very capable stereo audio system.

Software: Volumio

On the Raspberry runs a specialised Linux distribution with a web-based player for audio files and web radio. The project is called Volumio, and it is currently one of my favorite projects. What I like about it is that it is only a music player, not a project that tries to manage all kinds of media. Volumio’s user interface is accessible via web browser, and it is very user friendly, modern, pretty and clean.

It works at all screen sizes (from mobile phone to desktop) and is very easy to handle. It’s a web interface to mpd, which runs on the Raspi and does the actual music indexing and playing.

Volumio User Interface

The distribution everything runs on is an optimized Raspbian, with a variety of kernel drivers for audio output devices. Everything is preconfigured. Once the Volumio image is written to the SD card, the Raspberry boots up, and the device basically starts to play nicely once the network and media source have been configured through the web interface.

It is impressive how the Volumio project aims for absolute simplicity in use while still letting you play around with a whole lot of interesting settings. A lot can be configured, but very little has to be.

Bottom line: Volumio is a really interesting project that you’ll want to try if you’re interested in these things.

The Housing

I built the housing out of cherry wood from the region here. I got it from friends who grow cherries, and they gave me a few very nice shelves. The wood was sliced and planed to 10 mm thickness. The dark inlay is teak which I got from 1970s furniture that was found in bulky waste years ago.

After everything was cut, the cherry wood was glued together, with some internal supporting construction from plywood. After sanding and polishing, the box was done.

Sizes [cm]

The Raspberry and HifiBerry are mounted on a little construction attached to the metal back cover, together with the speaker connector. The metal cover is fastened with one screw, and the whole electronics assembly can be taken out of the wooden box by unscrewing it.

At the bottom of the device there is a switching power supply that provides the energy for the player.

How to Operate?

The player is completely operated from the web interface. That makes all additional knobs and buttons superfluous; the user even uses a tablet or notebook to adjust the volume. And since the device is never switched off, it does not even have a power button.

I combined it with Denon SC-M39 speakers. The music files come from a consumer NAS in the home network. The Raspberry is easily powerful enough for the software, and the HifiBerry is surprisingly powerful and clean. The sound is very nice: clear and fresh in the mids and highs, but still with enough bass, which is never annoyingly blurry and never drones.

I am very happy about the result of this little project. Hope you like it :-)


Translating Haskell to C++ metaprogramming

Sun, 07/12/2015 - 00:00

Haskell and C++ are very different programming languages: Haskell is purely functional and C++ is imperative. But while ‘normal’ C++ is imperative, C++ metaprogramming is purely functional, just like Haskell. This blog post shows examples of Haskell code that I’ve translated to C++.

Why would you do this?

Last week I blogged about Blasien. Blasien is a set of header files for writing literal validated XML in C++. XML can be validated against a DTD, XML Schema or Relax NG. Relax NG is of interest to me since that schema language is used in the OpenDocument Format standard. Relax NG is an elegant schema language, but validating against it is not trivial. The goal with Blasien is to do this validation at compile time.

James Clark has written an elegant algorithm for Relax NG validation. His algorithm is written in a subset of Haskell. In the past, I've ported his algorithm to JavaScript for use in WebODF. For use in Blasien, I've now ported it to C++ metaprogramming. This took some puzzling, but the result feels very natural once you get used to the syntax.

A simple example

Haskell is a very clean programming language, and it is very different from C++. Haskell is a purely functional programming language, and so is C++ metaprogramming. Purely functional means that all structures are immutable and functions have no side-effects. All a function does is take immutable structures as input and create a new immutable structure as output. The same is true in C++ metaprogramming.

Here is an example of a function that adds two integers.

Haskell:

    add :: Int -> Int -> Int
    add a b = a + b

    main :: IO ()
    main = do
      let r = add 1 2
      putStrLn $ show r

C++:

    #include <iostream>

    template <int a, int b>
    struct add {
        static constexpr int value = a + b;
    };

    int main() {
        auto r = add<1,2>::value;
        std::cout << r << std::endl;
        return 0;
    }

Both of these examples have a main function which prints out a value. That part of the examples is not pure: printing out a value is a side-effect. But the calculation of the value is pure. In C++ the value of r will be calculated at compile time. (Probably in Haskell too.)

Data types

Haskell has algebraic data types. In the code fragment below, NameClass is a data type with four variants: AnyName, QName, NsName or NameClassChoice. Instances of AnyName have no members. Instances of QName have two members of type Uri and LocalName respectively. The members are not named but can be accessed via pattern matching. This will be explained below.

In C++, the variants of NameClass are not connected directly. Each of them is defined as a separate struct. The members of each data type are given via template parameters. We are using structs everywhere now: both functions and data types are defined via template structs.

Haskell:

    type Uri = String
    type LocalName = String

    data NameClass = AnyName
                   | QName Uri LocalName
                   | NsName Uri
                   | NameClassChoice NameClass NameClass

C++:

    using String = const char*;
    constexpr char emptyString[] = "";

    using Uri = String;
    using LocalName = String;

    struct AnyName;

    template <String U, String L>
    struct QName {
        static constexpr String Uri = U;
        static constexpr String LocalName = L;
    };

    template <String U>
    struct NsName {
        static constexpr String Uri = U;
    };

    template <typename NC1, typename NC2>
    struct NameClassChoice {
        using NameClass1 = NC1;
        using NameClass2 = NC2;
    };

The C++ definition of QName differs in an important aspect from the definition of NameClassChoice. QName has static constexpr String in front of its members and NameClassChoice has using before its members. This is because C++ makes a distinction between type parameters and value parameters. QName has two String values as parameters and NameClassChoice has two NameClass types as parameters.

Note that AnyName does not have a definition. It has no members and therefore no need for a definition. All it needs is a declaration. C++ metaprogramming is programming with types. None of these types will be instantiated, so they do not require a complete definition.

Function overloading with pattern matching and template specialization

Haskell does not have function overloading. Instead it has pattern matching (and type classes). Only one function with a particular name is possible. That function can still handle different types of input if the input parameters are algebraic data types. A different implementation is possible for each variant of an algebraic data type.

This is demonstrated in the function contains. The first parameter to this function should be of type NameClass. There are four variants of this data type and hence four implementations are possible. The function contains returns true if the NameClass instance contains the given QName. The members of each variant of NameClass are exposed via pattern matching.

_ is a wildcard and is used in the first variant AnyName. Since AnyName matches any QName instance, the value of QName is not bound to a variable. The implementations of the QName and NsName variants are very straightforward. The implementation of the NameClassChoice variant is recursive.

C++ metaprogramming also uses pattern matching. In C++ this is called template specialization. First a template struct is defined. This may or may not have a definition. Next, a specialization is written for each of the variants of the algebraic data type.

Haskell:

    contains :: NameClass -> QName -> Bool
    contains AnyName _ = True
    contains (QName ns1 ln1) (QName ns2 ln2) = (ns1 == ns2) && (ln1 == ln2)
    contains (NsName ns1) (QName ns2 _) = (ns1 == ns2)
    contains (NameClassChoice nc1 nc2) n = (contains nc1 n) || (contains nc2 n)

C++:

    template <typename NameClass, typename QName>
    struct contains;

    template <String U, String L>
    struct contains<AnyName, QName<U,L>> {
        static constexpr bool value = true;
    };

    template <String U1, String L1, String U2, String L2>
    struct contains<QName<U1,L1>, QName<U2,L2>> {
        static constexpr bool value = strcmp(U1, U2) == 0 && strcmp(L1, L2) == 0;
    };

    template <String U1, String U2, String L2>
    struct contains<NsName<U1>, QName<U2,L2>> {
        static constexpr bool value = strcmp(U1, U2) == 0;
    };

    template <typename NameClass1, typename NameClass2, String U2, String L2>
    struct contains<NameClassChoice<NameClass1,NameClass2>, QName<U2,L2>> {
        static constexpr bool value = contains<NameClass1, QName<U2,L2>>::value
                                   || contains<NameClass2, QName<U2,L2>>::value;
    };

Running unit tests at compile time

Some people prefer type checking. Others choose unit tests. Having both is best. A neat feature of C++ metaprogramming is that it is possible to write unit tests that run at compile time. Here are some tests for our new contains function.

constexpr char xhtmlNS[] = "http://www.w3.org/1999/xhtml";
constexpr char divLocalName[] = "div";
constexpr char pLocalName[] = "p";

void testContainsAnyName() {
    using PQName = QName<xhtmlNS, pLocalName>;
    static_assert(contains<AnyName,PQName>::value,
                  "AnyName should match PQName.");
}

void testContainsQName() {
    using PQName = QName<xhtmlNS, pLocalName>;
    using DivQName = QName<xhtmlNS, divLocalName>;
    static_assert(!contains<DivQName,PQName>::value,
                  "DivQName should not match PQName.");
}

void testContainsNsName() {
    using PQName = QName<xhtmlNS, pLocalName>;
    using HtmlNsName = NsName<xhtmlNS>;
    static_assert(contains<HtmlNsName,PQName>::value,
                  "HtmlNsName should match PQName.");
}

void testContainsNameClassChoice() {
    using PQName = QName<xhtmlNS, pLocalName>;
    using DivQName = QName<xhtmlNS, divLocalName>;
    using HtmlNsName = NsName<xhtmlNS>;
    using NameChoice = NameClassChoice<DivQName,HtmlNsName>;
    static_assert(contains<NameChoice,PQName>::value,
                  "NameChoice should match PQName.");
}

These functions do not even need to be called from anywhere for the tests to run (at least not in GCC; other compilers may behave differently). They do, of course, have to be part of a compilation unit.

Again, why are you doing this?

I have translated (most of) this Relax NG validation algorithm to C++ metaprogramming. You can look at the progress so far.

The point of doing this is to further Blasien. Generating XML from C++ (and most programming languages for that matter) is a source of many invalid documents because doing validation at either runtime or compile time is currently incomplete and cumbersome. With a convenient way to write XML inside of C++ I hope this will change. Doing most of the validation at compile time will catch most of the common errors. Solving nice puzzles and learning more Haskell and C++ while developing Blasien is a nice bonus.

Plasma 5 and kdbus testing

Fri, 07/10/2015 - 22:03

Thanks to Mike Pagano, who enabled kdbus support in the Gentoo kernel sources almost two weeks ago, we now have the option to test it. As described in Mike's blog post, you will need to enable the USE flags kdbus and experimental on sys-kernel/gentoo-sources and kdbus on sys-apps/systemd.

root # echo "sys-kernel/gentoo-sources kdbus experimental" >> /etc/portage/package.use/kdbus

If you are running >=sys-apps/systemd-221, kdbus is already enabled by default; otherwise you have to enable it.

root # echo "sys-apps/systemd kdbus" >> /etc/portage/package.use/kdbus

Any packages affected by the change need to be rebuilt.

root # emerge -avuND @world

Enable the kdbus option in the kernel configuration:

General setup --->
<*> kdbus interprocess communication

Build the kernel, install it and reboot. Now we can check whether kdbus is enabled properly. systemd should automatically mask dbus.service and start systemd-bus-proxyd.service instead (thanks to eliasp for the info).

root # systemctl status dbus
● dbus.service
Loaded: masked (/dev/null)
Active: inactive (dead)


root # systemctl status systemd-bus-proxyd
● systemd-bus-proxyd.service - Legacy D-Bus Protocol Compatibility Daemon
Loaded: loaded (/usr/lib64/systemd/system/systemd-bus-proxyd.service; static; vendor preset: enabled)
Active: active (running) since Fr 2015-07-10 22:42:16 CEST; 16min ago
Main PID: 317 (systemd-bus-pro)
CGroup: /system.slice/systemd-bus-proxyd.service
└─317 /usr/lib/systemd/systemd-bus-proxyd --address=kernel:path=/sys/fs/kdbus/0-system/bus

Plasma 5 starts fine here using sddm as login manager. On Plasma 4 you may be interested in Bug #553460.

Looking forward to Plasma 5 getting user session support.

Have fun!

ownCloud Chunking NG Part 2: Announcing an Upload

Fri, 07/10/2015 - 15:40

The first part of this little blog series explained the basic operations of the chunked file upload as we set it up for discussion. This part goes a bit beyond that and describes an addition called announcing the upload.

With the processing described in the first part of the blog, the upload is done safely and with a clean approach, but it also has some drawbacks.

Most notably, the server does not know the target filename of the uploaded file upfront. It also does not know the final size or mimetype of the target file. That is not a problem in general, but imagine the following situation: a big file is uploaded that would exceed the user's quota. That would only become an error for the user once all chunks have been uploaded and the final upload directory is moved to the final file name.

To avoid useless file transfers like that, or to implement features like a file firewall, it would be good if the server knew this data at the start of the upload and could stop the upload in case it cannot be accepted.

To achieve that, the client creates a file called _meta in /uploads/ before the upload of the chunks starts. The file contains information such as overall size, target file name and other meta information.
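Since nothing about this protocol is decided yet, the exact format of _meta is open. As a purely hypothetical sketch, it could be a small JSON document along these lines (all field names here are invented for illustration):

```
{
    "target": "/Documents/video.mp4",
    "size": 104857600,
    "mimetype": "video/mp4"
}
```

With such a file in place, the server could compare the announced size against the user's remaining quota before a single chunk is transferred.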

The server’s reply to the PUT of the _meta file can be a failure result code and an error description, indicating that the upload will not be accepted due to certain server conditions. The client should check the result code in order to avoid unnecessarily uploading data whose final MOVE would fail anyway.

This is just a collection of ideas for an improved big file chunking protocol, nothing is decided yet. But now is the time to discuss. We’re looking forward to hearing your input.

The third and last part will describe how this plays into delta sync, which is especially interesting for big files, which are usually chunked.


Kontact on Windows

Fri, 07/10/2015 - 12:49

I recently had the dubious pleasure of getting Kontact to work on Windows, and after two weeks of agony it also yielded some results =)

Not only did I get Kontact to build on Windows (sadly still something to be proud of), it is also largely functional. Even timezones now work in a way that lets you collaborate with non-Windows users, although that required a patch or two to kdelibs.

To make the whole exercise as reproducible as possible, I collected my complete setup in a git repository [0]. Note that these builds are from the Kolab stable branches, and not all the Windows-specific fixes have made it back upstream yet. That will follow as soon as the waters calm a bit.

If you want to try it yourself, you can download an installer here [1],
and if you don’t (I won’t judge you for not using Windows) you can look at the pretty pictures.

[0] https://github.com/cmollekopf/kdepimwindows
[1] http://mirror.kolabsys.com/pub/upload/windows/Kontact-E5-2015-06-30-19-41.exe

Kontact_WindowsAccount_Wizard


Calligra 2.9.6 Released

Fri, 07/10/2015 - 06:32

We are pleased to announce that Calligra Suite and Calligra Active 2.9.6 have just been released. This recommended update brings further improvements to the 2.9 series of the applications and the underlying development frameworks.

Support Calligra!

Bugfixes in This Release

Here is an overview of the most important fixes. There are several others that are not mentioned here.

General
  • Fix building on CentOS and Windows
  • Fix Collapse button in docker titlebar shown even if set to uncollapsable
  • Fix saving of sections inside tables
  • Fix “keep lines together” check box in paragraph check box
  • Restore loading of default styles (ODF’s defaultstyles.xml) and make it work when multiple document types are used in the very same application (for example in Gemini).
  • Fix disappearing blocks during relayouts. Symptoms would be that the block would appear and reappear depending on whether both pages had been laid out or not (bug 345621)
  • Make sure the cursor is correct when hovering the corner handles of a shape (bug 347843)
  • Fix preset docker flickering and slowdown (related to bug 344968)
  • Make sure the footnote styles are found and that the footnote number (in the footnote itself) is present in the right baseline (using the correct footnote style when creating a new footnote is a bit problematic though, LibreOffice doesn’t seem to actually save the style) (bug 323232)
  • Don’t delete when at the end of table cell (bug 346297)
  • Fix an issue where some config files may not be picked up
Kexi
  • General:
    • Fix renames for file storing the Welcome status bar’s GUIs
    • Recent Projects: use file’s base name as a good replacement for caption when caption is not available
    • Fix left margin for the global search box (dependent on style); also react on changing widget style
    • Fix possible crash caused by command line arguments passed to Kexi in a wrong way
    • Fix crash appearing when the –hide-menu command line option is used
  • Queries:
    • Fix possible crash in result handling of queries
  • SQLite databases:
    • Fix compacting databases (properly rename files back to the original name)
  • PostgreSQL databases:
    • Fix crash when importing a PostgreSQL database to a .kexi file (bug 349156)
Krita
  • New Features:
    • Add possibility to continue a Crop Tool action
    • Speed up of color balance, desaturate, dodge, hsv adjustment, index color per-channel and posterize filters.
    • Activate Cut/Copy Sharp actions in the menu
    • Implemented continuation of the transform with clicking on canvas
    • new default workspace
    • Add new shortcuts (‘\’ opens the tool options, f5 opens the brush editor, f7 opens the preset selector.)
    • Show the tool options in a popup (toggle this on or off in the general preferences, needs restarting Krita)
    • Add three new default shortcuts (Create group layer = Ctrl+G, Merge Selected layer = Ctrl+Alt+E, Scale image to new size = Alt+Ctrl+I )
    • Add an ‘hide pop-up on mouseclick option’ to advanced color selector.
    • Make brush ‘speed’ sensor work properly
    • Allow preview for “Image Background Color and Transparency” dialog.
    • Selection modifier patch is finally in! (shift=add, alt=subtract, shift+alt=intersect, ctrl=replace. Path tool doesn’t work yet, and they can’t be configured yet)
  • Bug fixes:
    • Fix crash when saving a pattern to a *.kra (bug 346932)
    • Make Group Layer return correct extent and exact bounds when in pass-through mode
    • Make fixes to pass-through mode.
    • Added an optional optimization to slider spin box
    • Fix node activating on the wrong image (bug 348599)
    • Fix deleting a color in the palette docker (bug 349792)
    • Fix scale to image size while adding a file layer (bug 349823)
    • Fix wrapping issue for all dial widgets in Layer Styles dialog
    • Fix calculation of y-res when loading .kra files
    • Prevent a divide by zero (bug 349598)
    • Reset cursor when canvas is extended to avoid cursor getting stuck in “pointing hand” mode (bug 347800)
    • Fix tool options visibility by default (bug 348730)
    • Fix issue where changing theme doesn’t update user config (bug 349446)
    • Fix internal brush name of LJF smoke (bug 348451)
    • Set documents created from clipboard to modified (bug 349424)
    • Make more robust: check pointers before use (bug 349451)
    • Use our own code to save the merged image for kra and ora (is faster)
    • Fix Hairy brush not to paint black over transparent pixels in Soak Ink mode (bug 313296)
    • Fix PVS warning in hairy brush
    • Don’t limit the allowed dock areas (bug 348750)
    • Fix uninitialized m_maxPresets (bug 348795)
    • (gmic) Try to workaround the problem with busy cursor
    • (gmic) If there is selection, do not synchronize image size (bug 349346)
    • Disable autoscroll for the fill-tool as well (bug 348887)
    • Rename the fill layers (bug 348914)
Calligra Plan
  • Restore option to add items to reports in Plan’s report designer
  • Fix and update background app icon in Plan about HTML view
Calligra Words
  • Fix crash appearing sometimes with page breaks, and make page breaks work correctly on multi-column pages
  • Handle column breaks
  • Fix regressions that made floating text shapes geometry protected (bug 345426)
  • Fix a crash on closing a second document (bug 336145)


    Try It Out

    The source code of the release is available for download here: calligra-2.9.6.tar.xz, along with translations to many languages and MD5 sums.
    Alternatively, you can download binaries for many Linux distributions and for Windows.


    What’s Next and How to Help?

    The next step after the 2.9 series is Calligra 3.0 which will be based on new technologies. We expect it later in 2015.

    You can meet us to share your thoughts or offer your support on general Calligra forums or dedicated to Kexi or Krita. Many improvements are only possible thanks to the fact that we’re working together within the awesome community.

    (Some Calligra apps need new maintainers, you can become one, it’s fun!)

    How and Why to Support Calligra?

    Calligra apps may be totally free, but their development is costly. Power, hardware, office space, internet access, travelling for meetings – everything costs. Direct donation is the easiest and fastest way to efficiently support your favourite applications. Everyone, regardless of their degree of involvement, can do so. You can choose to:
    Support the entire Calligra suite indirectly by donating to KDE, the parent organization and community of Calligra: http://www.kde.org/community/donations.

    Support Krita directly by donating to the Krita Foundation, to support Krita development in general or development of a specific feature: https://krita.org/support-us/donations.

    Support Kexi directly by donating to its current BountySource fundraiser, supporting development of a specific feature or the team in general: https://www.bountysource.com/teams/kexi.

    About the Calligra Suite

    Calligra Suite is a graphic art and office suite developed by the KDE community. It is available for desktop PCs, tablet computers and smartphones. It contains applications for word processing, spreadsheets, presentation, databases, vector graphics and digital painting. For more information visit calligra.org.


    About KDE

    KDE is an international technology team that creates free and open source software for desktop and portable computing. Among KDE’s products are a modern desktop system for Linux and UNIX platforms, comprehensive office productivity and groupware suites and hundreds of software titles in many categories including Internet, multimedia, entertainment, education, graphics and software development. KDE’s software is available in more than 60 languages on Linux, BSD, Solaris, Windows and Mac OS X.

