#P1898. Entropy


No submission languages are available for this problem.

Description

In 1948, Claude E. Shannon introduced in "The Mathematical Theory of Communication" his famous formula for the entropy of a discrete set of probabilities $p_1, \ldots, p_n$:

$$H = -\sum_i p_i \log_2 p_i$$

We will apply this formula to an arbitrary text string by letting $p_i$ be the relative frequencies of occurrence of characters in the string. For example, the entropy of the string "Northeastern European Regional Contest", with a length of 38 characters (including 3 spaces), is 3.883 to 3 digits after the decimal point. The following table shows relative frequencies and the corresponding summands for the entropy of this string.
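
As an illustration (not part of the original statement), the following C++ sketch evaluates the formula for a string by counting characters and applying log2; it reproduces the 3.883 value quoted above for the example string.

```cpp
#include <cmath>
#include <cstdio>
#include <map>
#include <string>

// Entropy of a string: H = -sum p_i * log2(p_i), where p_i is the
// relative frequency of each distinct character in the string.
double entropy(const std::string &s) {
    std::map<char, int> count;
    for (char c : s) count[c]++;
    double h = 0.0, n = s.size();
    for (const auto &kv : count) {
        double p = kv.second / n;
        h -= p * std::log2(p);
    }
    return h;
}

int main() {
    // Prints about 3.883 for the example from the statement.
    printf("%.3f\n", entropy("Northeastern European Regional Contest"));
    return 0;
}
```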

Your task is to find a string with the given entropy.

Input

The input consists of a single real number H (0.00 <= H <= 6.00) with 2 digits after the decimal point.

Output

Write to the output file a line with a single string of at least one and at most 1000 characters drawn from '0'-'9', 'a'-'z', 'A'-'Z', '.' (dot), and space. This string must have an entropy within 0.005 of H.

Sample Input

3.88

Sample Output

Northeastern European Regional Contest
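
The statement does not prescribe a method, but one possible approach (an assumption on my part, not the official solution) is to search over strings built from the 64 allowed characters in which one character appears a times and d-1 further characters appear b times each. The entropy of such a string depends only on the counts (a, b, d), and with a total length of at most 1000 this family appears dense enough to land within 0.005 of any H in [0, 6]. A minimal C++ sketch of that brute-force search:

```cpp
#include <cmath>
#include <cstdio>
#include <string>

// Entropy of a string in which one character appears a times and
// (d - 1) other characters appear b times each.
double entropy(int a, int b, int d) {
    double n = a + (double)(d - 1) * b;
    double h = -(a / n) * std::log2(a / n);
    if (d > 1) h -= (d - 1) * (b / n) * std::log2(b / n);
    return h;
}

int main() {
    // The 64 characters allowed by the output specification.
    const std::string alphabet =
        "0123456789abcdefghijklmnopqrstuvwxyz"
        "ABCDEFGHIJKLMNOPQRSTUVWXYZ. ";

    double H;
    if (scanf("%lf", &H) != 1) return 0;

    int bestA = 1, bestB = 1, bestD = 1;
    double bestErr = 1e9;
    // Brute force over (d, a, b): d distinct characters, the first used
    // a times, the remaining d - 1 used b times each, length <= 1000.
    for (int d = 1; d <= 64; ++d) {
        int bMax = (d == 1) ? 1 : (1000 - 1) / (d - 1);
        for (int b = 1; b <= bMax; ++b)
            for (int a = 1; a + (d - 1) * b <= 1000; ++a) {
                double err = std::fabs(entropy(a, b, d) - H);
                if (err < bestErr) {
                    bestErr = err;
                    bestA = a; bestB = b; bestD = d;
                }
            }
    }

    // Build and print the best-matching string found.
    std::string out(bestA, alphabet[0]);
    for (int i = 1; i < bestD; ++i) out += std::string(bestB, alphabet[i]);
    printf("%s\n", out.c_str());
    return 0;
}
```

Note that for the sample input 3.88 the printed string will generally differ from the sample output; any string whose entropy lies within 0.005 of H is acceptable.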

Source

Northeastern Europe 2003