I'm fetching data via AJAX as JSON. The problem is that the payload totals 10 to 45 megabytes, and constantly transferring that volume is, to put it mildly, not great.

How can I receive the data in compressed form, restore it to its original form on the client side in the browser, and then process it further?

If the same data is compressed with an ordinary zip archiver, it shrinks to just 40-150 kilobytes, but how do I implement this over AJAX?

// PHP (server side):
header('Content-Type: application/json; charset=utf-8');
echo json_encode($data);

// JavaScript (client side):
function getData() {
  fetch('/data.php') // endpoint path here is a placeholder
    .then(response => response.json())
    .then(data => {
      console.log(data);
    })
    .catch((error) => {
      console.error(error);
    });
}
A typical web server can compress everything automatically with gzip or brotli. Which web server do you have, and how is it configured?

andreymal 2022-01-25 18:24:13
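If the server ends up being nginx (an assumption; Apache's mod_deflate or Caddy's `encode` directive do the same job), enabling gzip for JSON responses is a few lines of config:

```nginx
gzip on;
gzip_types application/json;  # nginx gzips text/html by default; JSON must be listed explicitly
gzip_min_length 1024;         # skip compressing tiny responses
gzip_comp_level 5;            # reasonable speed/ratio trade-off
```

With this in place, no changes to the PHP or JavaScript code are needed at all.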

@andreymal none yet, I'm working locally. So if gzip is enabled, all the data will be transferred in compressed form? And where can I verify that it's compressed? Will it be visible in the browser devtools, on the Network tab? That's where I see what data is transmitted and its size.

Meru382 2022-01-25 18:24:13
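Besides devtools, the header can also be checked from script. A minimal sketch (the URL is a placeholder; works in the browser and in Node 18+, both of which provide `fetch`):

```javascript
// Check whether the server compressed the response.
// Note: fetch transparently decompresses gzip/br bodies, so
// response.json() already receives decompressed data on the client.
async function checkCompression(url) {
  const response = await fetch(url);
  // Header names are case-insensitive in the Headers API.
  console.log(response.headers.get('content-encoding')); // e.g. "gzip", "br", or null
  return response.json();
}
```

If `content-encoding` comes back `null`, the server sent the body uncompressed.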

The web server sets Content-Encoding in the response headers; if it's absent, the response isn't compressed. As a last resort you can compress manually and add Content-Encoding yourself in PHP, but it's probably better not to if the web server itself supports compression.

andreymal 2022-01-25 18:24:13