Wednesday, 17 June 2015
Elasticsearch and mappings
Testing out Kibana for a specific use-case, I wanted to geocode data. This was not "as easy" as 1..2..3, but once you know the little tricks / pre-requisites, a working demo can be configured within minutes.
The flow of events is described step by step in the Mapping section below.
Some observations, and perhaps topics for future posts:
- Know your source data and what you want to achieve
- Learn grok; a grok debugger is a great tool for testing your patterns
- Standardise your message formats
- An Elasticsearch cluster really needs 3 nodes (to avoid the split-brain issue)
- Schema-less is not really schema-less, and "mapping" fields takes some practice to get used to.
- Geo data for visualization in Kibana needs to be mapped (have its type set) to "geo_point", as dynamic mapping classifies it as text/string.
The focus of this post is on creating the correct mapping for map visualizations.
Logstash:
Install the GeoIP database (the free MaxMind GeoLite City database is sufficient for the proof of concept):
cd /etc/logstash
sudo curl -O "http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz"
sudo gunzip GeoLiteCity.dat.gz
This will be used as the database value of the geoip filter in the Logstash configuration file (/etc/logstash/conf.d/*).
Filter the input stream with grok filters and patterns, and make sure the IP address values you want to map are assigned to a field.
Specify that field as the source in the geoip filter.
Sample filter configuration:
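The embedded configuration no longer renders here, so below is a minimal sketch of what such a filter could look like. The grok pattern and the src_ip field name are assumptions based on the rest of this post and must be adapted to your own message format.

filter {
  grok {
    # hypothetical pattern - capture the first IP address in the message into src_ip
    match => { "message" => "%{IP:src_ip}" }
  }
  geoip {
    # field populated by the grok filter above
    source => "src_ip"
    # the GeoLiteCity database downloaded earlier
    database => "/etc/logstash/GeoLiteCity.dat"
  }
}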
Then send the output to the Elasticsearch host or cluster, and to the correct index, in the output section.
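As a sketch of such an output section (localhost and the index name tester are assumptions; Logstash 1.5 uses host and protocol here, while newer releases use hosts):

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "tester"
  }
}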
Test your logstash configuration with:
$ sudo /opt/logstash/bin/logstash --configtest --config /etc/logstash/conf.d/01-src_ip-mikrotik.conf
Configuration OK means all is well and you are good to go.
Elasticsearch:
Download and install elasticsearch
wget https://download.elastic.co/elasticsearch/elasticsearch/elasticsearch-1.6.0.deb
sudo dpkg -i elasticsearch-1.6.0.deb
Configure the settings in /etc/elasticsearch/elasticsearch.yml; for a proof of concept the minimum settings needed (assuming multicast traffic is allowed between the Elasticsearch nodes) are:
cluster.name: Africa
node.name: "Elephant"
path.data: /media/raid/elasticsearch
discovery.zen.minimum_master_nodes: 2
and start up elasticsearch
sudo service elasticsearch start
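Not a step from the original list, but a quick sanity check is to query the cluster health API and confirm the status and number of nodes:

curl -XGET 'http://localhost:9200/_cluster/health?pretty'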
Mapping:
At last we get to mapping. Under normal circumstances you would:
- Configure and make sure your device is pushing the data (syslog in this instance) to Logstash
- Start Logstash
- Logstash will
- ingest the data as configured in the "input" section
- apply the grok filters to assign values to fields
- use the IP address field (src_ip in this example) and feed it to geoip
- geoip will retrieve data from the GeoIP database and add the values to the geoip fields
- return the results as configured in the "output" section
- Elasticsearch will receive the data and automagically create types for each field (this is called dynamic mapping) as soon as it receives the first message.
- Search and do whatever you want to do at this stage (install and open Kibana, add an index pattern and analyse ..)
However, the following holds true:
You can ONLY specify the mapping for a type when you first create an index. New fields can be added after the index has been created, but the mapping of existing fields cannot be changed.
A good source to read here
Now, to create a field called location (containing the LON & LAT coordinates) with a type of geo_point for use in Kibana map visualizations, we need to do the following, in this order:
- STOP all sources that push data to elasticsearch for the specific index. (in our example: stop logstash)
- DELETE the index (Remember we can't update existing field types)
- MAP the new index (PUT)
- Confirm the mapping (GET)
- Start logstash and feed elasticsearch data
- Enjoy ....
I use a Firefox add-on called RESTClient to send the REST commands to Elasticsearch.
Sample commands to perform the above steps (tester is used as the index name):
To delete the index:
DELETE /tester
To create the new mapping:
PUT /tester
with the source of the mapping (Body):
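The embedded mapping no longer renders here; a minimal sketch follows, assuming the documents are indexed under a type called logs and that the geoip filter writes the coordinates to geoip.location (both names are assumptions to adapt to your setup):

{
  "mappings": {
    "logs": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}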
To view / confirm the new mapping:
GET /tester/_mapping
Till next time ....
Saturday, 13 June 2015
Kubuntu DNS settings
It seems DNS settings are no longer stored in resolv.conf, but they can still be extracted via the NetworkManager CLI.
nmcli dev show [device] e.g. nmcli dev show wlan0
This will return information including:
IP4.DNS[1]:
IP4.DOMAIN[1]:
Therefore a quick awk can return the value needed:
nmcli dev show wlan0 | grep IP4.DNS | awk -F: '{ print $2 }' | awk '{ print $1 }'
Monday, 8 June 2015
Graylog2 not starting
Upon investigation it was found that MongoDB did not start up; this was due to a .lock file still being present (it was not cleanly removed on reset).
To remove the lock:
sudo rm /var/lib/mongodb/mongod.lock
and then start mongodb with
sudo service mongodb start
If it allocates a process id (pid), you know it is up and running; this can be further confirmed with
sudo service mongodb status
Tuesday, 5 May 2015
Streaming music via Airplay from Ubuntu to Pioneer VSX-922
So, after numerous complaints from my wife that the music setup in our house is not user friendly and that she can only listen to ad-hoc music from the laptop (an iPod has a playlist synced to it), I did some research and was surprised to find that PulseAudio has an AirPlay module (raop).
The Pioneer amp needs UDP rather than the TCP streaming used by the native module, so I had to compile a patched PulseAudio and create an icon on the desktop to "click and then listen to music".
That approach is sort of user friendly (still a few clicks to swap between the two PulseAudio versions, but it works); the next step is to fully automate it.
Installation:
The complete installation instructions can be found here : http://hfujita.github.io/pulseaudio-raop2/
But the steps are summarised below:
sudo apt-get install build-essential paprefs git pulseaudio-module-raop intltool
sudo apt-get build-dep pulseaudio
git clone https://github.com/hfujita/pulseaudio-raop2.git
cd pulseaudio-raop2
./autogen.sh
CFLAGS="-ggdb3 -O0" LDFLAGS="-ggdb3" ./configure --prefix=$HOME --enable-x11 --disable-hal-compat
make
Then you need to configure it to discover AirPlay devices on the network:
paprefs &
In the Network Access tab, turn on "Make discoverable Apple AirTunes sound devices available locally".
Then I wrote a little bash script named "music on amp" with a popup instructing her what to do next.
So currently it is a "guided automation" but it works.
Desktop icon: (music_on_amp.desktop)
[Desktop Entry]
Encoding=UTF-8
Name=Airplay
Comment=Play music to Airplay enabled devices
Exec=/home/user/listen_on_Pioneer.sh
Icon=/home/user/Pictures/airplay.png
Type=Application
Name[en_GB]=Music on AMP
Bash script: (listen_on_pioneer.sh)
#!/bin/bash
# source installed from http://hfujita.github.io/pulseaudio-raop2/
# kill the running pulseaudio
pulseaudio -k
cd ~/pulseaudio-raop2
./src/pulseaudio -n -F src/default.pa -p $(pwd)/src/ &
zenity --info --text 'Dear wife, Please select VSX-922 on next box that will pop up and close'
unity-control-center sound
notify-send "Now open Amarok & Enjoy your music my love ..."
I have added an icon & script to change the music back to the laptop (as the Pioneer kept changing to the media source whenever the laptop sent a notification or played music).
Bash script: (listen_on_pc.sh)
#!/bin/bash
# kill the running (raop2) pulseaudio
pulseaudio -k
# start the original pulseaudio daemon
pulseaudio -D
# to list the sinks
# pacmd list-sinks | grep -e 'name:' -e 'index'
pacmd set-default-sink 1
notify-send "Music will play on PC."
This will kill the raop2 build, start the original PulseAudio, and set the sink to the built-in speakers.
Monday, 4 May 2015
Sublime Text Mikrotik plugin for text highlighting & auto complete
I had some challenges to resolve on a network (more on that in my next post) and could only access the CLI, but needed to see the complete config at once. I used Sublime Text to view the file; however, having to look at a log or configuration file in a single colour is not fun.
Had a quick look and was in luck: there is a Mikrotik Sublime Text plugin, many thanks Kentzo. It can be installed via Package Control or found on Kentzo's GitHub repository.
The simplest method of installation is through the Sublime Text console as described here. The console is accessed via the ctrl+` shortcut or the View > Show Console menu. Once open, paste the appropriate Python code for your version of Sublime Text into the console.
import urllib.request,os,hashlib; h = 'eb2297e1a458f27d836c04bb0cbaf282' + 'd0e7a3098092775ccb37ca9d6b2e4b7d'; pf = 'Package Control.sublime-package'; ipp = sublime.installed_packages_path(); urllib.request.install_opener( urllib.request.build_opener( urllib.request.ProxyHandler()) ); by = urllib.request.urlopen( 'http://packagecontrol.io/' + pf.replace(' ', '%20')).read(); dh = hashlib.sha256(by).hexdigest(); print('Error validating download (got %s instead of %s), please try manual install' % (dh, h)) if dh != h else open(os.path.join( ipp, pf), 'wb' ).write(by)
After that, simply go to Preferences > Package Control and select Install Package; then type the name of the package you want to install.
Labels:
Mikrotik,
Sublime Text
Wednesday, 25 March 2015
Serial console for linux - minicom
I have been using this for quite some time but never posted it on my blog. The best serial port communication program is minicom; it works like a charm with most USB-to-serial converters and supports X-modem transfers as well.
useful links:
https://help.ubuntu.com/community/Minicom
https://help.ubuntu.com/community/CiscoConsole
I have found that USB-to-serial converters don't play so well with Cisco Wireless Access Points; this is not a minicom issue but seems to be a USB driver and flow control issue.
Settings for WAP:
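The original settings screenshot is not reproduced here. As a rule of thumb (not from the original post), Cisco console ports expect 9600 baud, 8 data bits, no parity, 1 stop bit, with hardware and software flow control switched off. The device path below is just an example:

minicom -D /dev/ttyUSB0 -b 9600

Flow control itself is toggled under minicom -s > Serial port setup.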
Easiest way to find the serial device in Ubuntu Server:
$ dmesg | grep tty
and the setserial application can provide some detail as well; to return information for ttyS0 to ttyS4:
sudo setserial -g /dev/ttyS[01234]
and it will return something like:
/dev/ttyS0, UART: 16550A, Port: 0x03f8, IRQ: 4
/dev/ttyS1, UART: 16550A, Port: 0x02f8, IRQ: 3
/dev/ttyS2, UART: unknown, Port: 0x03e8, IRQ: 4
/dev/ttyS3, UART: unknown, Port: 0x02e8, IRQ: 3
/dev/ttyS4, UART: unknown, Port: 0x0000, IRQ: 0
Labels:
linux,
Minicom,
Serial Console
Citrix Receiver on K/Ubuntu 14.10
Installing the Citrix client became a lot easier with time.
Just download the latest version from the Citrix website here (at the time of writing it was icaclient_13.1.0.285639_amd64.deb) and install it.
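The install itself is just a dpkg run against the downloaded package (the second command only pulls in any missing dependencies):

sudo dpkg -i icaclient_13.1.0.285639_amd64.deb
sudo apt-get -f install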
For Firefox: restart your browser and ensure the plugin is set to "Always Activate".
Should you encounter an SSL Error 61 afterwards, just link the Mozilla certificates to the Citrix ICAClient keystore:
sudo ln /usr/share/ca-certificates/mozilla/* /opt/Citrix/ICAClient/keystore/cacerts
That should sort out the error.
Handy links:
https://help.ubuntu.com/community/CitrixICAClientHowTo
Sunday, 15 February 2015
Rsync server on QNAP nas
To use the Rsync server on a QNAP NAS, remember to select "Allow remote Rsync server to back up data to NAS".
Then to run a manual rsync backup from Ubuntu in terminal enter the following command:
$ rsync -azvvv /home/user/ servi@[qnap ip or hostname]::user/Ubuntu/
The above command will:
-vvv be very verbose
-a use archive mode
-z compress file data during transfer
servi - is the username configured on the QNAP Rsync screen
user/Ubuntu/ - is the destination folder on the NAS