Bye bye Drupal, hello WordPress

After failing to properly migrate this blog from Drupal 7 to Drupal 8, I decided to try a migration to WordPress.

The Drupal site used PostgreSQL as its database backend, so I first had to migrate it to MySQL using the DBTNG Migrator module.

Afterwards I used the Drupal2WordPress plugin to migrate all nodes, including taxonomy terms and comments. The plugin works perfectly fine with WordPress 5.x; you just need to install it manually.
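Installing it manually just means unpacking the archive into wp-content/plugins and activating it afterwards. A minimal sketch; the WordPress root and the archive location are placeholders, not taken from the post:

```shell
# Manual plugin install sketch -- WP_ROOT and PLUGIN_ZIP are placeholders.
WP_ROOT="${WP_ROOT:-./wordpress}"                  # point this at your WordPress root
PLUGIN_ZIP="${PLUGIN_ZIP:-drupal2wordpress.zip}"   # zip from the plugin's project page

mkdir -p "$WP_ROOT/wp-content/plugins"
if [ -f "$PLUGIN_ZIP" ]; then
  unzip -o "$PLUGIN_ZIP" -d "$WP_ROOT/wp-content/plugins"
else
  echo "download the plugin zip to $PLUGIN_ZIP first"
fi
# afterwards, activate it under wp-admin -> Plugins
```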

Host your own private image gallery using Piwigo and Nginx

You want to show some photos of the kids to your family in a secure fashion, but do not want to rely on the cloud folks to store your data? Then running Piwigo on your home server or a VPS is a great way to do exactly that.

Piwigo is free, open-source software with a rich feature set and lots of available plugins and themes. If you’re using Lightroom to manage your photo library, there is the Piwigo publisher plug-in by alloyphoto (one of the best $15 I’ve ever spent, great support included). Read More

The way to Elasticsearch 2.0, or how to reindex your dot fields with logstash and ruby filters

The Elasticsearch 2.0 release introduced a major annoyance by removing support for dots in field names. We use ES for our Apache logs with a retention policy of 365 days, and of course _all_ of the indices contained fields with a dot in the name.

What’s even worse, at some point I had the idea of filtering the request parameters out of the URI and running a kv filter on them. As we never used the resulting mess of request_params.* fields, those could simply be dropped.

The first step was to update our Logstash configuration so that no dots are used in field names.

Then we needed an automated way to re-index all of our indices: replacing all dots (.) with underscores (_) in the field names, dropping the irrelevant fields, and moving all data into a new index. I came up with a method using Logstash and a ruby filter, wrapped in a bash script that iterates over all indices, sed’s the index name into the template below, and runs Logstash with it. Logstash shuts itself down once the index has been read in completely.
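The wrapper can be sketched roughly as follows. The ES host, the index names and the exact ruby filter body are assumptions for illustration, not the original script, and the Logstash invocation is only echoed here:

```shell
#!/usr/bin/env bash
# Sketch of the wrapper described above. ES host, index names and the
# exact ruby filter body are assumptions, not the original script.
set -euo pipefail

ES_HOST="localhost:9200"   # assumed Elasticsearch host

# Logstash template: read one index, rename dotted fields via a ruby
# filter (Logstash 2.x event API), drop request_params.*, and write
# everything to a new index.
read -r -d '' TEMPLATE <<'EOF' || true
input {
  elasticsearch { hosts => ["ES_HOST"] index => "OLD_INDEX" }
}
filter {
  ruby {
    code => "
      event.to_hash.keys.each do |k|
        if k.start_with?('request_params.')
          event.remove(k)                             # never used, drop
        elsif k.include?('.')
          event[k.gsub('.', '_')] = event.remove(k)   # dot -> underscore
        end
      end
    "
  }
}
output {
  elasticsearch { hosts => ["ES_HOST"] index => "OLD_INDEX-reindexed" }
}
EOF

# In practice the index list would come from "$ES_HOST/_cat/indices".
for idx in logstash-2015.01.01 logstash-2015.01.02; do
  conf="reindex-${idx}.conf"
  printf '%s\n' "$TEMPLATE" \
    | sed -e "s/OLD_INDEX/${idx}/g" -e "s/ES_HOST/${ES_HOST}/g" > "$conf"
  echo "logstash -f $conf"   # run this per index; logstash stops by itself
done
```

Logstash exits on its own because the elasticsearch input finishes once the scroll over the source index is exhausted, which is what makes the one-config-per-index loop practical.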

Read More

Indexing and searching Weblogic logs using Logstash, Elasticsearch and Kibana

This is a re-edit of my previous post “Indexing and searching Weblogic logs using Logstash and Graylog2”. Our setup has since settled on Kibana instead of the Graylog2 frontend. This howto is meant to be a complete installation guide for the Elasticsearch ELK stack, using it to index tons of WebLogic server and application logs, from the DEV environment through UA to Production. Read More

Howto migrate SonarQube from MySQL to Oracle

Recently we needed to move Sonar off our small virtualized MySQL server, because the Sonar database had begun to grow huge. Really HUGE. We’d like to keep data for about 3 months, one month is already worth several GB, and our MySQL server isn’t set up for that amount of data.

So we decided to move it to our Oracle database. Thanks to SQL Developer, this was quite an easy process. Read More

Howto easily update GPS-A data on a Sony Alpha 65/77/99 and others on Linux/Mac

In order to speed up GPS locking on a Sony Alpha 65 (or similar) SLT camera, it’s possible to update the GPS-A data (also called almanac data). As on any other modern GPS device, the almanac data gives the device a hint where the satellites are located. The data is usually valid for only a few weeks; then it needs to be updated again.

The common way to get the almanac data onto a Sony Alpha is to use Sony’s own software, which is a no-go for Linux users. But there are other ways to do it. Read More
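One of those ways is to fetch the assist file yourself and copy it onto the memory card. The download URL and the on-card path below are drawn from community documentation and are assumptions, not taken from this post:

```shell
# GPS-A update sketch -- the URL and the PRIVATE/SONY/GPS path are
# assumptions based on community documentation of Sony's assist data.
CARD="${CARD:-./SDCARD}"   # set to your card's mount point, e.g. under /media
URL="http://control.d-imaging.sony.co.jp/GPS/assistme.dat"

mkdir -p "$CARD/PRIVATE/SONY/GPS"
curl -fsSL "$URL" -o "$CARD/PRIVATE/SONY/GPS/assistme.dat" \
  || echo "download failed -- check your network or the (assumed) URL"
# put the card back into the camera; the assist data is picked up automatically
```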