There is nothing inherently limiting about the tools you've chosen, but at that scale the backend architecture becomes really important.
I wouldn't use Elasticsearch as a primary datastore; that's not what it's designed for. Use something like PostgreSQL instead (or look into Cassandra).
You can use Elasticsearch as a cache for the "aggregated view" of your data (after joining together all 15-20 datapoints), but if you are not doing full-text search over it, you may as well keep everything in PostgreSQL.
A configuration that may work is PostgreSQL with two sets of tables: one holding the original data, and one with a structure optimized for the most common operations your product performs.
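As a rough sketch of what I mean (all the table, column, and datapoint names here are made up, and a materialized view is just one of several ways to maintain the optimized copy):

```sql
-- Original, normalized data: one row per raw datapoint.
CREATE TABLE raw_datapoints (
    id         bigserial PRIMARY KEY,
    entity_id  bigint NOT NULL,
    name       text   NOT NULL,
    value      text,
    created_at timestamptz NOT NULL DEFAULT now()
);

CREATE INDEX ON raw_datapoints (entity_id);

-- Denormalized "aggregated view": one wide row per entity,
-- pre-joining the datapoints your product reads most often.
CREATE MATERIALIZED VIEW entity_summary AS
SELECT entity_id,
       max(value) FILTER (WHERE name = 'status')     AS status,
       max(value) FILTER (WHERE name = 'last_login') AS last_login
FROM raw_datapoints
GROUP BY entity_id;

-- REFRESH ... CONCURRENTLY requires a unique index,
-- and lets reads continue while the view is rebuilt.
CREATE UNIQUE INDEX ON entity_summary (entity_id);

REFRESH MATERIALIZED VIEW CONCURRENTLY entity_summary;
```

Whether you use a materialized view, triggers, or a batch job to keep the second set of tables in sync depends on how fresh the aggregated data needs to be.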
I have seen this sort of setup work well for processing 20 million records per day, but we spent a fair amount of time optimizing it, and our rows were largely independent of each other.