The Python Elasticsearch Domain Specific Language (DSL) lets you define models as Python classes.
Take a look at the model Elastic creates in their persistence example.
I wrapped their example in a script named persist.py. To initialize the model, run persist.py from the command line.
We can take a look at these mappings via the _mapping API. The model names the index blog, so use blog when you send the request to the API.
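A stdlib-only sketch of that request (equivalent to `GET blog/_mapping`; the localhost:9200 endpoint is an assumption):

```python
# get_mapping.py -- fetch the auto-generated mapping for the 'blog' index.
# Stdlib-only sketch; assumes Elasticsearch listens on localhost:9200.
import json
import urllib.request

MAPPING_URL = 'http://localhost:9200/blog/_mapping'

def fetch_mapping(url=MAPPING_URL):
    """Return the index mapping as a Python dict (GET blog/_mapping)."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode('utf-8'))

# Usage (requires a running cluster):
#     print(json.dumps(fetch_mapping(), indent=2))
```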
Initializing the model (the Article.init() call in persist.py) generated the following mapping (schema) automatically.
That’s pretty neat! The DSL creates the mapping (schema) for you, with the right types. Now that we have the model and mapping in place, use the Elastic-provided example to create a document.
Again, I wrapped their code in a script. Run the script.
If you look at the mapping, you see that the published_from field maps to a Date type. To see this in Kibana, go to Management –> Index Patterns.
Now type blog (the name of the index from the model) into the Index Name or Pattern box.
From here, you can select published_from as the time-field name.
If you go to Discover, you will see your blog post.
Logstash, however, uses @timestamp for the time-field name. It would be nice to use the standard name instead of a one-off, custom name. To use @timestamp, we must first update the model.
In persist.py (above), change the save stanza from…
It took me a ton of trial and error to finally realize that @timestamp has to be set as a dictionary key. I just shared the special sauce recipe with you, so you’re welcome! Once you update the model, run create_doc.py (above) again.
Then, go back to Kibana –> Management –> Index Patterns and delete the old blog pattern.
When you re-create the index pattern, you will now have a pull-down for @timestamp.
Now go to Discover and you will see the @timestamp field in your blog post.
You can go back to the _mapping API to see the new mapping for @timestamp.
This command returns the JSON-encoded mapping.
Unfortunately, we may still have a problem. Notice that @timestamp here is in the form of “April 1st 2017, 19:28:47.842.” If you’re sending a document to an existing Logstash doc store, that store most likely uses the default @timestamp format.
To accommodate the default @timestamp format (or any custom format), you can update the model’s save stanza with a string-format-time (strftime) call.
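As a sketch, one strftime pattern that produces Logstash’s default ISO8601 form (UTC with millisecond precision, e.g. 2017-04-01T19:28:47.842Z; the helper name is mine):

```python
# Format @timestamp the way Logstash does by default: ISO8601, UTC,
# millisecond precision (strftime's %f gives microseconds, so trim 3 digits).
from datetime import datetime

def logstash_timestamp(dt=None):
    dt = dt or datetime.utcnow()
    return dt.strftime('%Y-%m-%dT%H:%M:%S.%f')[:-3] + 'Z'

# In the model's save() stanza this becomes:
#     self['@timestamp'] = logstash_timestamp()
```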
You can see the change in Kibana as well (view the raw JSON).
That’s it! The more you use the Python Elasticsearch DSL, the more you will love it.