SuperTutor


$15 per page (negotiable)

About SuperTutor

Levels Taught:
Elementary, Middle School, High School, College, University, PhD

Expertise:
Accounting, Business & Finance, Economics, Engineering, HR Management, Math
Teaching Since: Apr 2017
Last Sign-in: 327 weeks, 4 days ago
Questions Answered: 12843
Tutorials Posted: 12834

Education

  • MBA, Ph.D in Management
    Harvard University
    Feb-1997 - Aug-2003

Experience

  • Professor
    Strayer University
    Jan-2007 - Present

Category: Programming | Posted 18 May 2017 | My Price: $9.00

Part B: Readability

This program will read in the contents of a text file containing a normal text document and reorganize it by storing each sentence of the text separately.

For our purposes, the end of a sentence is marked by any word that ends with one of the three characters . ? !

Read in the lines of text, process them into words, and store them into an array (or, optionally, an ArrayList) of objects of your Sentence class. Each Sentence object should start out empty, and have words added to it as they are received from the file. Your Sentence class must contain at least the following:

  • text— (String) the text of the sentence;
  • wordCount— (int) the number of words in the sentence (count only words, where a word is any token containing at least one letter); and
  • add(String word)— add the given word to the sentence.

If you are storing the sentences in an array, you can assume that the text file contains at most 1000 sentences.
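
A minimal sketch of such a Sentence class, using the field and method names listed above (the getters and the isComplete helper are illustrative additions, not requirements of the assignment):

    public class Sentence {
        private String text = "";   // the text of the sentence, built up word by word
        private int wordCount = 0;  // counts only words containing at least one letter

        // Add the given word to the end of the sentence.
        public void add(String word) {
            text = text.isEmpty() ? word : text + " " + word;
            if (word.matches(".*[A-Za-z].*")) {  // a word must contain a letter
                wordCount++;
            }
        }

        public String getText() { return text; }

        public int getWordCount() { return wordCount; }

        // Convenience helper: a word ending in . ? or ! ends the sentence.
        public boolean isComplete() {
            return text.endsWith(".") || text.endsWith("?") || text.endsWith("!");
        }
    }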

Once you have read in the contents of the file, process it in the following way:

  1. Print the first five sentences in the file. Number the sentences according to their sequence in the document (the first sentence is number 1).
  2. Print the last five sentences in the file. Number the sentences according to their sequence in the document.
  3. Print summary statistics over the entire document, including the number of letters (counting only letters, not digits, spaces, or other punctuation), words, and sentences, and the Automated Readability Index (ARI) of the text. The ARI is calculated as follows: ARI = 4.71 * (letters / words) + 0.5 * (words / sentences) - 21.43, and provides an estimate of the readability of the text according to its grade level. Round it to one decimal place.
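
A sketch of one possible shape for the main program, under the assumptions above (the file name document.txt, the class name Readability, and the choice of an ArrayList over a plain array are illustrative, not part of the assignment):

    import java.io.File;
    import java.io.FileNotFoundException;
    import java.util.ArrayList;
    import java.util.Scanner;

    public class Readability {
        public static void main(String[] args) throws FileNotFoundException {
            Scanner in = new Scanner(new File("document.txt")); // file name is an assumption
            ArrayList<Sentence> sentences = new ArrayList<>();
            Sentence current = new Sentence();
            long letters = 0;

            while (in.hasNext()) {
                String word = in.next();
                current.add(word);
                for (char c : word.toCharArray()) {
                    if (Character.isLetter(c)) letters++;  // letters only, no digits or punctuation
                }
                if (word.endsWith(".") || word.endsWith("?") || word.endsWith("!")) {
                    sentences.add(current);  // this word ends the current sentence
                    current = new Sentence();
                }
            }

            long words = 0;
            for (Sentence s : sentences) words += s.getWordCount();

            System.out.println("The first five sentences:");
            for (int i = 0; i < Math.min(5, sentences.size()); i++)
                System.out.println("(" + (i + 1) + ") " + sentences.get(i).getText());

            System.out.println("The last five sentences:");
            for (int i = Math.max(0, sentences.size() - 5); i < sentences.size(); i++)
                System.out.println("(" + (i + 1) + ") " + sentences.get(i).getText());

            // Automated Readability Index, rounded to one decimal place on output.
            double ari = 4.71 * ((double) letters / words)
                       + 0.5 * ((double) words / sentences.size()) - 21.43;
            System.out.println("Summary statistics:");
            System.out.println("Letters: " + letters);
            System.out.println("Words: " + words);
            System.out.println("Sentences: " + sentences.size());
            System.out.printf("Readability: %.1f%n", ari);
        }
    }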

For example, given the following text file:

Question 2. Read in the lines of text, process them into words, and store them
into an array (or, optionally, an ArrayList) of objects of your Sentence class.
Each Sentence object should start out empty, and have words added to it as they
are received from the file. Your Sentence class must contain at least the
following. If you are storing the sentences in an array, you can assume that the
text file contains at most 1000 sentences. Once you have read in the contents of
the file, process it in the following way!

your output would end like this (the first five sentences not shown):

The last five sentences:
(2) Read in the lines of text, process them into words, and store them into an
array (or, optionally, an ArrayList) of objects of your Sentence class.
(3) Each Sentence object should start out empty, and have words added to it as
they are received from the file.
(4) Your Sentence class must contain at least the following.
(5) If you are storing the sentences in an array, you can assume that the text
file contains at most 1000 sentences.
(6) Once you have read in the contents of the file, process it in the following
way!

Summary statistics:
Letters: 404
Words: 92
Sentences: 6
Readability: 6.9
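
As a check, plugging the sample statistics into the ARI formula above: 4.71 * (404 / 92) + 0.5 * (92 / 6) - 21.43 ≈ 20.68 + 7.67 - 21.43 ≈ 6.92, which rounds to 6.9.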

Text that follows is based on the Wikipedia page on cryptography! Cryptography is the practice and study of hiding information. In modern times, cryptography is considered to be a branch of both mathematics and computer science, and is affiliated closely with information theory, computer security, and engineering. Cryptography is used in applications present in technologically advanced societies; examples include the security of ATM cards, computer passwords, and electronic commerce, which all depend on cryptography. Until modern times, cryptography referred almost exclusively to encryption, the process of converting ordinary information (plaintext) into unintelligible gibberish (i.e., ciphertext). Decryption is the reverse, moving from unintelligible ciphertext to plaintext. A cipher (or cypher) is a pair of algorithms which perform this encryption and the reversing decryption. The detailed operation of a cipher is controlled both by the algorithm and, in each instance, by a key. This is a secret parameter (ideally, known only to the communicants) for a specific message exchange context. Keys are important, as ciphers without variable keys are trivially breakable and therefore less than useful for most purposes. Historically, ciphers were often used directly for encryption or decryption, without additional procedures such as authentication or integrity checks. In colloquial use, the term "code" is often used to mean any method of encryption or concealment of meaning. However, in cryptography, code has a more specific meaning; it means the replacement of a unit of plaintext (i.e., a meaningful word or phrase) with a code word (for example, apple pie replaces attack at dawn). Codes are no longer used in serious cryptography - except incidentally for such things as unit designations (e.g., 'Bronco Flight' or Operation Overlord) - since properly chosen ciphers are both more practical and more secure than even the best codes, and better adapted to computers as well. Some use the terms cryptography and cryptology interchangeably in English, while others use cryptography to refer specifically to the use and practice of cryptographic techniques, and cryptology to refer to the combined study of cryptography and cryptanalysis. The Ancient Greek scytale (rhymes with Italy), probably much like this modern reconstruction, may have been one of the earliest devices used to implement a cipher. Before the modern era, cryptography was concerned solely with message confidentiality (i.e., encryption) - conversion of messages from a comprehensible form into an incomprehensible one, and back again at the other end, rendering it unreadable by interceptors or eavesdroppers without secret knowledge (namely, the key needed for decryption of that message). In recent decades, the field has expanded beyond confidentiality concerns to include techniques for message integrity checking, sender/receiver identity authentication, digital signatures, interactive proofs, and secure computation, amongst others. The earliest forms of secret writing required little more than local pen and paper analogs, as most people could not read. More literacy, or opponent literacy, required actual cryptography. 
The main classical cipher types are transposition ciphers, which rearrange the order of letters in a message (e.g., 'help me' becomes 'ehpl em' in a trivially simple rearrangement scheme), and substitution ciphers, which systematically replace letters or groups of letters with other letters or groups of letters (e.g., 'fly at once' becomes 'gmz bu podf' by replacing each letter with the one following it in the alphabet). Simple versions of either offered little confidentiality from enterprising opponents, and still don't. An early substitution cipher was the Caesar cipher, in which each letter in the plaintext was replaced by a letter some fixed number of positions further down the alphabet. It was named after Julius Caesar who is reported to have used it, with a shift of 3, to communicate with his generals during his military campaigns, just like EXCESS-3 code in boolean algebra. Encryption attempts to ensure secrecy in communications, such as those of spies, military leaders, and diplomats, but it has also had religious applications. For instance, early Christians used cryptography to obfuscate some aspects of their religious writings to avoid the near certain persecution they would have faced had they been less cautious; famously, 666 or in some early manuscripts, 616, the Number of the Beast from the Christian New Testament Book of Revelation, is sometimes thought to be a ciphertext referring to the Roman Emperor Nero, one of whose policies was persecution of Christians. There is record of several, even earlier, Hebrew ciphers as well. Cryptography is recommended in the Kama Sutra as a way for lovers to communicate without inconvenient discovery. Steganography (i.e., hiding even the existence of a message so as to keep it confidential) was also first developed in ancient times. An early example, from Herodotus, concealed a message - a tattoo on a slave's shaved head - under the regrown hair. More modern examples of steganography include the use of invisible ink, microdots, and digital watermarks to conceal information. Ciphertexts produced by classical ciphers (and some modern ones) always reveal statistical information about the plaintext, which can often be used to break them. After the discovery of frequency analysis (perhaps by the Arab polymath al-Kindi) about the 9th century, nearly all such ciphers became more or less readily breakable by an informed attacker. Such classical ciphers still enjoy popularity today, though mostly as puzzles (see cryptogram). Essentially all ciphers remained vulnerable to cryptanalysis using this technique until the invention of the polyalphabetic cipher, most clearly by Leon Battista Alberti around the year 1467 (though there is some indication of earlier Arab knowledge of them). Alberti's innovation was to use different ciphers (i.e., substitution alphabets) for various parts of a message (perhaps for each successive plaintext letter in the limit). He also invented what was probably the first automatic cipher device, a wheel which implemented a partial realization of his invention. In the polyalphabetic Vigenère cipher, encryption uses a key word, which controls letter substitution depending on which letter of the key word is used. In the mid 1800s Babbage showed that polyalphabetic ciphers of this type remained partially vulnerable to frequency analysis techniques. 
The Enigma machine, used in several variants by the German military between the late 1920s and the end of World War II, implemented a complex electro-mechanical polyalphabetic cipher to protect sensitive communications. Breaking the Enigma cipher at the Biuro Szyfrów, and the subsequent large-scale decryption of Enigma traffic at Bletchley Park, was an important factor contributing to the Allied victory in WWII. Although frequency analysis is a powerful and general technique, encryption was still often effective in practice; many a would-be cryptanalyst was unaware of the technique. Breaking a message without frequency analysis essentially required knowledge of the cipher used, thus encouraging espionage, bribery, burglary, defection, etc. to discover it. It was finally explicitly recognized in the 19th century that secrecy of a cipher's algorithm is not a sensible or practical safeguard; in fact, it was further realized that any adequate cryptographic scheme (including ciphers) should remain secure even if the adversary fully understands the cipher algorithm itself. Secrecy of the key alone should be sufficient for a good cipher to maintain confidentiality under attack. This fundamental principle was first explicitly stated in 1883 by Auguste Kerckhoffs and is generally called Kerckhoffs' principle; alternatively and more bluntly, it was restated by Claude Shannon as Shannon's Maxim - 'the enemy knows the system'. Various physical devices and aids have been used to assist with ciphers. One of the earliest may have been the scytale of ancient Greece, a rod supposedly used by the Spartans as an aid for a transposition cipher. In medieval times, other aids were invented such as the cipher grille, also used for a kind of steganography. With the invention of polyalphabetic ciphers came more sophisticated aids such as Alberti's own cipher disk, Johannes Trithemius' tabula recta scheme, and Thomas Jefferson's multi-cylinder (reinvented independently by Bazeries around 1900). Several mechanical encryption/decryption devices were invented early in the 20th century, and many patented, including rotor machines - most famously the Enigma machine used by Germany in World War II. The ciphers implemented by better quality examples of these designs brought about a substantial increase in cryptanalytic difficulty after WWI. The development of digital computers and electronics after WWII made possible much more complex ciphers. Furthermore, computers allowed for the encryption of any kind of data that is represented by computers in any binary format, unlike classical ciphers which only encrypted written language texts, dissolving the utility of a linguistic approach to cryptanalysis in many cases. Many computer ciphers can be characterized by their operation on binary bit sequences (sometimes in groups or blocks), unlike classical and mechanical schemes, which generally manipulate traditional characters (i.e., letters and digits) directly. However, computers have also assisted cryptanalysis, which has compensated to some extent for increased cipher complexity. Nonetheless, good modern ciphers have stayed ahead of cryptanalysis; it is usually the case that use of a quality cipher is very efficient (i.e., fast and requiring few resources), while breaking it requires an effort many orders of magnitude larger, making cryptanalysis so inefficient and impractical as to be effectively impossible. A credit card with smart card capabilities. The 3 by 5 mm chip embedded in the card is shown enlarged in the insert.
Smart cards attempt to combine portability with the power to compute modern cryptographic algorithms. Extensive open academic research into cryptography is relatively recent - it began only in the mid-1970s with the public specification of DES (the Data Encryption Standard) by the NBS, the Diffie-Hellman paper, and the public release of the RSA algorithm. Since then, cryptography has become a widely used tool in communications, computer networks, and computer security generally. The present security level of many modern cryptographic techniques is based on the difficulty of certain computational problems, such as the integer factorisation problem or the discrete logarithm problem. In many cases, there are proofs that cryptographic techniques are secure if a certain computational problem cannot be solved efficiently. With one notable exception - the one-time pad - these proofs are contingent, and thus not definitive, but are currently the best available for cryptographic algorithms and protocols. As well as being aware of cryptographic history, cryptographic algorithm and system designers must also sensibly consider probable future developments in their designs. For instance, the continued improvements in computer processing power have increased the scope of brute-force attacks when specifying key lengths. The potential effects of quantum computing are already being considered by some cryptographic system designers; the announced imminence of small implementations of these machines is making the need for this preemptive caution fully explicit. Essentially, prior to the early 20th century, cryptography was chiefly concerned with linguistic patterns. Since then the emphasis has shifted, and cryptography now makes extensive use of mathematics, including aspects of information theory, computational complexity, statistics, combinatorics, abstract algebra, and number theory. Cryptography is also a branch of engineering, but an unusual one as it deals with active, intelligent, and malevolent opposition (see cryptographic engineering and security engineering); most other kinds of engineering need deal only with neutral natural forces. There is also active research examining the relationship between cryptographic problems and quantum physics (see quantum cryptography and quantum computing). The modern field of cryptography can be divided into several areas of study. The chief ones are discussed here; see Topics in Cryptography for more. Symmetric-key cryptography refers to encryption methods in which both the sender and receiver share the same key (or, less commonly, in which their keys are different, but related in an easily computable way). This was the only kind of encryption publicly known until 1976. One round (out of 8.5) of the patented IDEA cipher, used in some versions of PGP for high-speed encryption of, for instance, e-mail. The modern study of symmetric-key ciphers relates mainly to the study of block ciphers and stream ciphers and to their applications. A block cipher is, in a sense, a modern embodiment of Alberti's polyalphabetic cipher: block ciphers take as input a block of plaintext and a key, and output a block of ciphertext of the same size. Since messages are almost always longer than a single block, some method of knitting together successive blocks is required. Several have been developed, some with better security in one aspect or another than others. These are the modes of operation and must be carefully considered when using a block cipher in a cryptosystem.
The Data Encryption Standard (DES) and the Advanced Encryption Standard (AES) are block cipher designs which have been designated cryptography standards by the US government (though DES's designation was finally withdrawn after the AES was adopted). Despite its deprecation as an official standard, DES (especially its still-approved and much more secure triple-DES variant) remains quite popular; it is used across a wide range of applications, from ATM encryption to e-mail privacy and secure remote access. Many other block ciphers have been designed and released, with considerable variation in quality. Many have been thoroughly broken. See Category:Block ciphers. Stream ciphers, in contrast to the 'block' type, create an arbitrarily long stream of key material, which is combined with the plaintext bit-by-bit or character-by-character, somewhat like the one-time pad. In a stream cipher, the output stream is created based on an internal state which changes as the cipher operates. That state's change is controlled by the key, and, in some stream ciphers, by the plaintext stream as well. RC4 is an example of a well-known stream cipher; see Category:Stream ciphers. Cryptographic hash functions (often called message digest functions) do not necessarily use keys, but are a related and important class of cryptographic algorithms. They take input data (often an entire message), and output a short, fixed-length hash, and do so as a one-way function. For good ones, collisions (two plaintexts which produce the same hash) are extremely difficult to find. Message authentication codes (MACs) are much like cryptographic hash functions, except that a secret key is used to authenticate the hash value on receipt. Symmetric-key cryptosystems typically use the same key for encryption and decryption, though a given message or group of messages may have a different key than others. A significant disadvantage of symmetric ciphers is the key management necessary to use them securely. Each distinct pair of communicating parties must, ideally, share a different key, and perhaps each ciphertext exchanged as well. The number of keys required increases as the square of the number of network members, which very quickly requires complex key management schemes to keep them all straight and secret. The difficulty of establishing a secret key between two communicating parties, when a secure channel doesn't already exist between them, also presents a chicken-and-egg problem which is a considerable practical obstacle for cryptography users in the real world. Whitfield Diffie and Martin Hellman, inventors of public-key cryptography. In a groundbreaking 1976 paper, Whitfield Diffie and Martin Hellman proposed the notion of public-key (also, more generally, called asymmetric key) cryptography in which two different but mathematically related keys are used - a public key and a private key. A public key system is so constructed that calculation of one key (the 'private key') is computationally infeasible from the other (the 'public key'), even though they are necessarily related. Instead, both keys are generated secretly, as an interrelated pair. The historian David Kahn described public-key cryptography as "the most revolutionary new concept in the field since polyalphabetic substitution emerged in the Renaissance". In public-key cryptosystems, the public key may be freely distributed, while its paired private key must remain secret.
The public key is typically used for encryption, while the private or secret key is used for decryption. Diffie and Hellman showed that public-key cryptography was possible by presenting the Diffie-Hellman key exchange protocol. In 1978, Ronald Rivest, Adi Shamir, and Len Adleman invented RSA, another public-key system. In 1997, it finally became publicly known that asymmetric key cryptography had been invented by James H. Ellis at GCHQ, a British intelligence organization, in the early 1970s, and that both the Diffie-Hellman and RSA algorithms had been previously developed (by Malcolm J. Williamson and Clifford Cocks, respectively). The Diffie-Hellman and RSA algorithms, in addition to being the first publicly known examples of high quality public-key ciphers, have been among the most widely used. Others include the Cramer-Shoup cryptosystem, ElGamal encryption, and various elliptic curve techniques. See Category:Asymmetric-key cryptosystems. Padlock icon from the Firefox web browser, meant to indicate a page has been sent in SSL or TLS-encrypted protected form. However, such an icon is not a guarantee of security; a subverted browser might mislead a user by displaying a proper icon when a transmission is not actually being protected by SSL or TLS. In addition to encryption, public-key cryptography can be used to implement digital signature schemes. A digital signature is reminiscent of an ordinary signature; they both have the characteristic that they are easy for a user to produce, but difficult for anyone else to forge. Digital signatures can also be permanently tied to the content of the message being signed; they cannot be 'moved' from one document to another, for any attempt will be detectable. In digital signature schemes, there are two algorithms: one for signing, in which a secret key is used to process the message (or a hash of the message, or both), and one for verification, in which the matching public key is used with the message to check the validity of the signature. RSA and DSA are two of the most popular digital signature schemes. Digital signatures are central to the operation of public key infrastructures and many network security schemes (SSL/TLS, many VPNs, etc). Public-key algorithms are most often based on the computational complexity of "hard" problems, often from number theory. For example, the hardness of RSA is related to the integer factorization problem, while Diffie-Hellman and DSA are related to the discrete logarithm problem. More recently, elliptic curve cryptography has developed in which security is based on number theoretic problems involving elliptic curves. Because of the difficulty of the underlying problems, most public-key algorithms involve operations such as modular multiplication and exponentiation, which are much more computationally expensive than the techniques used in most block ciphers, especially with typical key sizes. As a result, public-key cryptosystems are commonly hybrid cryptosystems, in which a fast high-quality symmetric-key encryption algorithm is used for the message itself, while the relevant symmetric key is sent with the message, but encrypted using a public-key algorithm. Similarly, hybrid signature schemes are often used, in which a cryptographic hash function is computed, and only the resulting hash is digitally signed. The goal of cryptanalysis is to find some weakness or insecurity in a cryptographic scheme, thus permitting its subversion or evasion. 
Cryptanalysis might be undertaken by a malicious attacker, attempting to subvert a system, or by the system's designer (or others) attempting to evaluate whether a system has vulnerabilities, and so it is not inherently a hostile act. In modern practice, however, cryptographic algorithms and protocols must be carefully examined and tested to offer any assurance of the system's security (at least, under clear - and hopefully reasonable - assumptions). It is a commonly held misconception that every encryption method can be broken. In connection with his WWII work at Bell Labs, Claude Shannon proved that the one-time pad cipher is unbreakable, provided the key material is truly random, never reused, kept secret from all possible attackers, and of equal or greater length than the message. Most ciphers, apart from the one-time pad, can be broken with enough computational effort by brute force attack, but the amount of effort needed may be exponentially dependent on the key size, as compared to the effort needed to use the cipher. In such cases, effective security could be achieved if it is proven that the effort required (i.e., 'work factor' in Shannon's terms) is beyond the ability of any adversary. This means it must be shown that no efficient method (as opposed to the time-consuming brute force method) can be found to break the cipher. Since no such showing can currently be made, the one-time pad remains the only theoretically unbreakable cipher. There are a wide variety of cryptanalytic attacks, and they can be classified in any of several ways. A common distinction turns on what an attacker knows and what capabilities are available. In a ciphertext-only attack, the cryptanalyst has access only to the ciphertext (good modern cryptosystems are usually effectively immune to ciphertext-only attacks). In a known-plaintext attack, the cryptanalyst has access to a ciphertext and its corresponding plaintext (or to many such pairs). In a chosen-plaintext attack, the cryptanalyst may choose a plaintext and learn its corresponding ciphertext (perhaps many times); an example is gardening, used by the British during WWII. Finally, in a chosen-ciphertext attack, the cryptanalyst may choose ciphertexts and learn their corresponding plaintexts. Also important, often overwhelmingly so, are mistakes (generally in the design or use of one of the protocols involved; see Cryptanalysis of the Enigma for some historical examples of this). Cryptanalysis of symmetric-key ciphers typically involves looking for attacks against the block ciphers or stream ciphers that are more efficient than any attack that could be mounted against a perfect cipher. For example, a simple brute force attack against DES requires one known plaintext and 2^55 decryptions, trying approximately half of the possible keys, to reach a point at which chances are better than even that the key sought will have been found. But this may not be enough assurance; a linear cryptanalysis attack against DES requires 2^43 known plaintexts and approximately 2^43 DES operations. This is a considerable improvement on brute force attacks. Public-key algorithms are based on the computational difficulty of various problems. The most famous of these is integer factorization (e.g., the RSA algorithm is based on a problem related to factoring), but the discrete logarithm problem is also important. Much public-key cryptanalysis concerns numerical algorithms for solving these computational problems, or some of them, efficiently.
For instance, the best known algorithms for solving the elliptic curve-based version of discrete logarithm are much more time-consuming than the best known algorithms for factoring, at least for problems of more or less equivalent size. Thus, other things being equal, to achieve an equivalent strength of attack resistance, factoring-based encryption techniques must use larger keys than elliptic curve techniques. For this reason, public-key cryptosystems based on elliptic curves have become popular since their invention in the mid-1990s. While pure cryptanalysis uses weaknesses in the algorithms themselves, other attacks on cryptosystems are based on actual use of the algorithms in real devices, and are called side-channel attacks. If a cryptanalyst has access to, say, the amount of time the device took to encrypt a number of plaintexts or...
