
I am currently studying for the Information Processing Engineer Exam.

I have a question about the buffering-time problem, Q31 from the fall 2017 exam.

The following is the problem statement.

  

To download 1.2 Mbytes of audio data with an encoding speed of 64 kbit/s over a network with a communication speed of 48 kbit/s and play it back without interruption, how many seconds' worth of data, at minimum, must be buffered before playback starts?

I understand how the answer is worked out: the playback time is 150 seconds (from the encoding speed), the download time is 200 seconds (from the communication speed), and their difference, 50 seconds, is the required buffering time.
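In code form, the calculation I am describing looks like this (a rough sketch; the variable names are my own):

    # Values taken from the problem statement
    data_bits = 1.2e6 * 8          # 1.2 Mbytes of audio data, in bits
    encoding_rate = 64e3           # encoding speed: 64 kbit/s
    network_rate = 48e3            # communication speed: 48 kbit/s

    playback_time = data_bits / encoding_rate       # 150.0 seconds
    download_time = data_bits / network_rate        # 200.0 seconds
    buffering_time = download_time - playback_time  # 50.0 seconds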

I can see that the playback time is calculated from the encoding speed, but I don't understand why that is the case.

My understanding is that encoding means either converting an analog signal into a digital signal or compressing digital data. In this problem I took it to be the latter, but is that understanding itself wrong to begin with?

I couldn't find any information that helped me, so I am asking here. I would appreciate an explanation.

  • Answer #1

      

    "Audio data with an encoding speed of 64 kbit/s"

    means that each second of playing time corresponds to 64 kbit of data;
    in other words, one second of voice data (waveform data) is converted into 64 kbit of digital data.
    Is that easier to understand?

    Note that the problem doesn't say anything about compression.
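
    To put the same idea in numbers, here is a quick sanity check in Python (assuming, as above, no compression):

        encoding_rate = 64e3   # 64 kbit of data per second of playing time
        seconds = 150          # the playback time in this problem
        data_bytes = encoding_rate * seconds / 8
        print(data_bytes)      # 1200000.0, i.e. the 1.2 Mbytes in the problem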

  • Answer #2

      

    However, I don't understand why the playback time is calculated from the encoding speed.

    It is impossible to obtain the playback time from the encoding speed alone. The data size is also required.

    Do you remember learning distance ÷ time = speed in elementary school?
    Rearranging it gives distance ÷ speed = time.

    Returning to the bits in this problem:
    number of bits ÷ bit rate = time
    (1.2 × 10^6 × 8) ÷ (64 × 10^3) = 9600 ÷ 64 = 150 seconds
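
    The same formula in Python (a sketch; the function name is my own, just to make the relationship concrete):

        def transfer_time(bits, bit_rate):
            # number of bits ÷ bit rate = time, same as distance ÷ speed = time
            return bits / bit_rate

        # 1.2 Mbytes played back at an encoding speed of 64 kbit/s:
        print(transfer_time(1.2e6 * 8, 64e3))  # 150.0 (seconds)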