During an analysis of metrics stored in an InfluxDB Time Series Database (TSDB), a third-party company got involved. Unfortunately they were not able to understand what a Unix nano timestamp is and needed a translation into a "human" date format. Luckily it's fairly easy to change the date/time format.
Metrics are usually stored with a nanosecond timestamp in TSDBs. This allows (in theory) collecting a data point only a nanosecond after the previous one, enabling a very (very very!) fine-grained analysis of metrics.
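You can produce such a timestamp yourself: GNU date prints the current Unix time with nanosecond resolution when combining the %s and %N format specifiers (note that %N is a GNU extension, not part of POSIX date):

```shell
# Current Unix time in nanoseconds (GNU date; %N is not available in POSIX date)
date +%s%N
```

The output is a 19-digit number, the same format as the time column shown below.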
While labels and tags on the metrics data can change, depending on the table or measurement (InfluxDB), the time column is a must to identify the timestamp of a metric. Without it a time series does not make sense. It's even in the name, so that should be enough of a clue.
Here's an example of how this looks inside a TSDB measurement (here InfluxDB v1):
> SELECT import_energy_active FROM meter WHERE time > '2024-11-12T04:50:00Z' AND time < '2024-11-12T05:15:00Z';
name: meter
time import_energy_active
---- --------------------
1731387036000000000 12573859
1731387097000000000 12573916
1731387158000000000 12573973
1731387218000000000 12574034
1731387279000000000 12574046
1731387340000000000 12574062
1731387401000000000 12574077
1731387462000000000 12574090
1731387523000000000 12574107
1731387583000000000 12574121
1731387644000000000 12574134
1731387754000000000 12574202
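As a side note: when querying interactively, the InfluxDB v1 influx CLI can display human readable timestamps by itself through its precision setting (this obviously requires access to the InfluxDB server, which the third party didn't have for the raw data export):

```
# Start the CLI with RFC3339 (human readable) timestamps
influx -precision rfc3339
# ... or switch the format inside an already running influx shell:
# > precision rfc3339
```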
The data in question was stored in a text file and contains the time and import_energy_active columns:
ck@mint ~ $ cat /tmp/data.txt
1731387036000000000 12573859
1731387097000000000 12573916
1731387158000000000 12573973
1731387218000000000 12574034
1731387279000000000 12574046
1731387340000000000 12574062
1731387401000000000 12574077
1731387462000000000 12574090
1731387523000000000 12574107
1731387583000000000 12574121
1731387644000000000 12574134
1731387754000000000 12574202
While the second column represents the value and stays unchanged, there are two operations I had to apply to the first (time) column: converting the nano timestamp into a Unix (seconds) timestamp, and then converting that into a human readable date.
Getting the Unix timestamp is as easy as dividing the nano timestamp by 1000000000 (10^9):
ck@mint ~ $ tsnano=1731387036000000000
ck@mint ~ $ tsunix=$(( ${tsnano} / 1000000000 ))
ck@mint ~ $ echo ${tsunix}
1731387036
And to convert the Unix timestamp into a human-readable date, we can use the date command:
ck@mint ~ $ echo $(date -d @${tsunix})
Tue Nov 12 05:50:36 AM CET 2024
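date also accepts a +FORMAT argument, in case something closer to ISO 8601 is preferred over the default locale output (the -u flag switches the output to UTC instead of the local timezone):

```shell
# Same timestamp, formatted as an ISO-like date in UTC
date -u -d @1731387036 +"%Y-%m-%d %H:%M:%S"
# 2024-11-12 04:50:36
```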
Based on the previous two operations, I simply ran the following while loop one-liner, reading the text file and converting the nano timestamps into human-readable dates:
ck@mint ~ $ while read tsnano value; do echo "$(date -d @$(( $tsnano / 1000000000))) $value"; done < /tmp/data.txt
Tue Nov 12 05:50:36 AM CET 2024 12573859
Tue Nov 12 05:51:37 AM CET 2024 12573916
Tue Nov 12 05:52:38 AM CET 2024 12573973
Tue Nov 12 05:53:38 AM CET 2024 12574034
Tue Nov 12 05:54:39 AM CET 2024 12574046
Tue Nov 12 05:55:40 AM CET 2024 12574062
Tue Nov 12 05:56:41 AM CET 2024 12574077
Tue Nov 12 05:57:42 AM CET 2024 12574090
Tue Nov 12 05:58:43 AM CET 2024 12574107
Tue Nov 12 05:59:43 AM CET 2024 12574121
Tue Nov 12 06:00:44 AM CET 2024 12574134
Tue Nov 12 06:02:34 AM CET 2024 12574202
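A variant of the same loop, in case spawning one date process per line gets slow on large files: bash's built-in printf supports a %(...)T time format (bash >= 4.2), so the whole conversion runs without any external command. This is a sketch assuming the file always contains exactly two columns:

```shell
#!/usr/bin/env bash
# Convert "nano-timestamp value" lines without forking date for every line;
# %(...)T is a bash >= 4.2 printf built-in that formats a Unix timestamp
while read -r tsnano value; do
  printf '%(%Y-%m-%d %H:%M:%S)T %s\n' "$(( tsnano / 1000000000 ))" "$value"
done < /tmp/data.txt
```

Like date, printf's %(...)T renders the timestamp in the local timezone unless TZ is set.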
Fairly easy and very quick to do.
However, I'm disappointed that this (large and global) external company was not able to understand or figure this out on their own.