Not all forums are open to self-promotion. I built a workaround for this.


We all know how exciting building a product is. Less exciting is convincing other people that our product is the right solution for their use-case.

After coding up the best version of our product, the marketing starts (even if we should have started earlier). Indiehackers, Product Hunt, and Hacker News are the bare minimum we can do to let the world know our baby is ready, and that we want people to try it out. However, what we really want is to find real users who find our product useful. And this is when (of course we should have done it before building our product) we look for forums where we could reply with our fantastic product link and save the day. The first thing that comes to mind: let’s check out Reddit! At this point, you think that you are providing at least some value to at least someone. …

Sell a list using Google sheets and Stripe


This is a follow-up article to how to sell a table. After trying out different options, I ended up building my own platform to easily sell a list of things:

I tried to make it really easy:
- You can prepare the list on Google Sheets and connect it to Tabslu
- Connect a Stripe account
- You are ready to sell it
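The steps above can be sketched in code. Tabslu’s internals are not public, so everything here is an assumption for illustration: the sheet’s column names (`name`, `price`), the CSV export, and the use of Stripe Checkout’s `line_items` payload shape. The idea is simply that a sheet row becomes a sellable item:

```python
import csv
import io

def sheet_to_line_items(csv_text):
    """Turn a Google Sheets CSV export (hypothetical 'name' and 'price'
    columns) into the line_items payload shape used by Stripe Checkout.
    Stripe expects amounts in the smallest currency unit, so dollar
    prices are converted to cents."""
    line_items = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        line_items.append({
            "price_data": {
                "currency": "usd",
                "product_data": {"name": row["name"]},
                # round() avoids float artifacts like 9.99 * 100 = 998.99...
                "unit_amount": int(round(float(row["price"]) * 100)),
            },
            "quantity": 1,
        })
    return line_items

# A tiny hypothetical sheet exported as CSV.
sheet = "name,price\nCompany leads list,9.99\nExtended dataset,29.00\n"
items = sheet_to_line_items(sheet)
print(items[0]["price_data"]["unit_amount"])  # 999
```

In a real integration, the resulting list would be passed to Stripe’s Checkout session creation call along with the seller’s connected account; this sketch only covers the sheet-to-payload step.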

Step by step example

Here I show how I built the examples on Tabslu’s landing page:

The easiest way to sell tabular data on the internet


A few months ago, I was collecting some data about companies and their customers. I thought this could be valuable to someone, in particular for companies that want to understand who their competitors’ customers are.

So I started collecting the data in a table. To make it look more appealing I would add some images, but basically, the table looked like this:

Why BERT is not the best choice for multilingual tasks

Image obtained translating “multilingual transformers” with and using

Last year, we saw rapid improvements in transformer architectures. Since the GLUE benchmark is the main reference point for the state of the art in language understanding tasks, most of the research effort focused on English data. BERT, RoBERTa, DistilBERT, XLNet — which one to use? provides an overview of recent transformer architectures and their pros and cons.

It is challenging to keep track of the GLUE leaderboard because progress on language understanding tasks is so fast-paced. Every month a different team takes the top position.

I had a brief go at some data provided by AIliveSim, who built a simulation tool to train and test agents in very realistic 3D environments.

I was really impressed by some of their maritime simulations. Snapshots from their simulated 3D environments look strikingly real, including realistic-looking sea surfaces with waves, as well as reflections and occasional splashes of water on the camera lens.

Snapshot of a simulation taken at daytime.

Progressive Generative Adversarial Networks (GANs), presented at ICLR 2018, achieved incredible results on synthetic face generation. Some of the improvements in this paper are also due to a carefully curated data set of high-resolution (1024 by 1024 pixels) celebrity photos. Given a latent vector in a 512-dimensional space, this neural network outputs a very realistic face that resembles famous actors. These are some of the faces a Progressive GAN can generate:


It is also possible to interpolate between faces by linear interpolation of two latent vectors. I found a useful Gist by Mat Kelcey that uses TensorFlow Hub to interpolate between latent vectors. TensorFlow Hub provides a collection of pre-trained networks that can be called straight from a Python script; these networks do not even need to be explicitly downloaded, which is quite convenient. It provides a trained Progressive GAN that I can run even on my laptop with no GPU. Unfortunately, this has some limitations: its resolution is 128 by 128 pixels and it does not seem to be trained on the same high-quality data set as the original paper. Still, when interpolating between latent vectors, the results are very smooth and natural. …
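The interpolation itself is just a convex combination of the two latent vectors: z(t) = (1 − t)·z0 + t·z1 for t in [0, 1]. A minimal NumPy sketch of that step (the GAN itself is left out; in practice each interpolated vector would be fed to the TensorFlow Hub Progressive GAN to render one frame of the morph):

```python
import numpy as np

def interpolate_latents(z0, z1, steps=10):
    """Linearly interpolate between two latent vectors:
    z(t) = (1 - t) * z0 + t * z1, with t evenly spaced in [0, 1]."""
    ts = np.linspace(0.0, 1.0, steps)
    return np.stack([(1.0 - t) * z0 + t * z1 for t in ts])

# Two random 512-dimensional latent vectors, matching the paper's latent size.
rng = np.random.default_rng(0)
z0 = rng.standard_normal(512)
z1 = rng.standard_normal(512)

path = interpolate_latents(z0, z1, steps=10)
print(path.shape)  # (10, 512)
# The endpoints reproduce z0 and z1 exactly, so the morph starts and
# ends on the two original faces.
```

Each row of `path` is one latent vector; rendering them in order gives the smooth face-to-face transition described above.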

PyData Seattle 2017 was held at Microsoft Redmond. It was my first time attending a non-academic conference on data science. I have to say that I really enjoyed it, mostly because several talks were hands-on and provided Jupyter notebooks. Also, most of the presenters were such good public speakers!

Here I am going to blog about the talks I enjoyed the most (among the ones I attended), making use of the many tweets in the @PyDataSeattle feed:

PyData Seattle 2017 kicked off on the 7th of July at Microsoft Redmond.

Highlights of the Conference

The most retweeted content of this conference was by far Jake VanderPlas’s keynote, “PyData 101”. He gave a smooth introduction to the origins of PyData and the many data science packages available out there. Python was born as a scripting language, and now it is also a more than valid alternative to R. …

We all got excited by the recent developments in deep neural networks. Among the different applications of deep learning, Natural Language Processing (NLP) has attracted quite a bit of interest. It is really great to see a machine learning model generating convincing text that resembles Shakespeare, Wikipedia, Harry Potter, Obama speeches, Star Wars episodes, and ultimately even code.

Is it possible to automate travel blogging with artificial intelligence? It would be great if your AI assistant helped you keep memories of your travels by automatically writing down stories about the places you visited. …


Simone Romano

