Wednesday, December 2, 2009

Given an eight-bit bitmap graphics file, devise an algorithm to convert the file into a two-bit ASCII approximation

Assume that the file format is one byte per pixel, and that the approximation produces one ASCII character of output for each pixel. This problem is easier to solve than it sounds. Obscuring a simple problem or making it sound difficult is a common trick in technical interview questions. Don't be fooled! Take the time to identify the core of the problem. In this case, all you need is an algorithm that reads the values in a file and outputs characters based on those values.
An eight-bit number ranges from 0 to 255; a two-bit number ranges from 0 to 3. Essentially, we want to divide the 256 possible eight-bit values into four ranges, each of which can be identified by a two-bit number. Dividing 0 to 255 uniformly gives four ranges of 64 values each: 0 to 63, 64 to 127, 128 to 191, and 192 to 255.
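Because each range is exactly 64 values wide, integer division by 64 maps an eight-bit value directly to its two-bit range index. Here is a minimal boundary check in C; the function name and sample values are just for illustration:

    #include <stdio.h>

    /* Map an 8-bit pixel value (0..255) to a 2-bit range index (0..3). */
    static int range_index(int pixel)
    {
        return pixel / 64; /* each range spans 64 values */
    }

    int main(void)
    {
        /* Boundary values of the four ranges; prints: 0 0 1 1 2 2 3 3 */
        int samples[] = { 0, 63, 64, 127, 128, 191, 192, 255 };
        for (int i = 0; i < 8; i++)
            printf("%d ", range_index(samples[i]));
        putchar('\n');
        return 0;
    }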
You then assign an ASCII character to each of those four ranges. For example, you could use "_", "~", "+", and "#". The algorithm is then as follows (a C sketch of these steps appears after the list):
1. Open the file.
2. For every byte in the file:
   a. Read in one byte.
   b. If the value is in the range 0..63, print '_'.
   c. If the value is in the range 64..127, print '~'.
   d. If the value is in the range 128..191, print '+'.
   e. If the value is in the range 192..255, print '#'.
3. Close the file.
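A minimal sketch of these steps in C, assuming the input is raw pixel bytes with no header and using the four characters chosen above; the command-line handling is illustrative:

    #include <stdio.h>

    int main(int argc, char *argv[])
    {
        /* One output character for each of the four value ranges. */
        static const char levels[4] = { '_', '~', '+', '#' };
        FILE *fp;
        int byte;

        if (argc != 2) {
            fprintf(stderr, "usage: %s bitmapfile\n", argv[0]);
            return 1;
        }

        /* 1. Open the file. */
        fp = fopen(argv[1], "rb");
        if (fp == NULL) {
            perror("fopen");
            return 1;
        }

        /* 2. For every byte: read it and print the character for its range. */
        while ((byte = fgetc(fp)) != EOF)
            putchar(levels[byte / 64]);
        putchar('\n');

        /* 3. Close the file. */
        fclose(fp);
        return 0;
    }

Note that this prints one ASCII character per pixel in a single stream, as the problem asks; inserting line breaks after each row would require knowing the image width, which the raw one-byte-per-pixel format described here does not carry.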
