...

The zip algorithm is capable of producing very large compression ratios [Mahmoud 2002|AA. Bibliography#Mahmoud References#Mahmoud 02]. Figure 2-1 shows a file that was compressed from 148 MB to 590 KB, a ratio of more than 200 to 1. The file consists of arbitrarily repeated data: alternating lines of _a_ characters and _b_ characters. Even higher compression ratios can easily be obtained by using input data that is targeted to the compression algorithm, by using more (untargeted) input data, or by using other compression methods.
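The effect is easy to reproduce with the standard `java.util.zip` APIs. The sketch below builds a small, highly repetitive input modeled on the file described above (alternating lines of _a_ and _b_ characters; the sizes and class name are illustrative assumptions, not the original 148 MB file) and reports the GZIP (DEFLATE) compression ratio:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class CompressionRatioDemo {

    // Returns original-size / compressed-size using GZIP (DEFLATE).
    static double compressionRatio(byte[] input) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(input);
        }
        return (double) input.length / out.size();
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical repetitive input: alternating lines of 'a' and 'b'
        // characters, mimicking the file described in the text.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10_000; i++) {
            sb.append("a".repeat(80)).append('\n')
              .append("b".repeat(80)).append('\n');
        }
        byte[] input = sb.toString().getBytes(StandardCharsets.US_ASCII);

        System.out.printf("original=%d bytes, ratio=%.0f:1%n",
                input.length, compressionRatio(input));
    }
}
```

Because DEFLATE encodes long repeated runs as short back-references, even this small input compresses at a ratio well beyond 200 to 1, which is why decompressing untrusted archives without a size limit is dangerous.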

...

[Mahmoud 2002] [Compressing and Decompressing Data Using Java APIs|http://java.sun.com/developer/technicalArticles/Programming/compression/]

...