The download fails when no assembly filter is applied, i.e., when we try to download all QH data at once. It reaches a certain "X% downloaded" and then the download simply halts.
The file seems to be too big. A single assembly is around 200 MB, so all the data put together would be around 800 MB. Rather than piping this through React + ElasticSearch, it may be better to serve it as a static file via nginx. That's how we treat LokDhaba_JS static files now: there's a script that compiles the files that need to be served, and then nginx serves them. We could do the same for QHSearch, which would also make the whole process a lot faster - one file per assembly, plus one file for all assemblies put together, served via nginx.
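A minimal sketch of what the nginx side could look like, assuming the compile script writes the per-assembly dumps and a combined file into a static directory (the paths, filenames, and location prefix below are hypothetical, not our current setup):

```nginx
# Hypothetical: serve precompiled QHSearch dumps as static files.
# /srv/qhsearch/static/ would hold one JSON file per assembly
# (e.g. assembly_1.json ... assembly_4.json) plus an
# all_assemblies.json produced by the compile script.
location /qhsearch/downloads/ {
    alias /srv/qhsearch/static/;
    gzip_static on;   # serve pre-gzipped copies if the script also emits .gz files
    add_header Content-Disposition "attachment";
    expires 1h;
}
```

With `gzip_static on` and pre-compressed `.gz` files alongside the JSON, the 800 MB combined download would shrink considerably on the wire without nginx compressing on the fly.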