Von Neumann was an important figure due to his contributions to the fundamentals of the computer. "John Von Neumann is usually considered to be the developer of the computer as we know it today" (Englander, 2009, p. 192). For the past six decades, computer architectures have followed the general concept created by von Neumann. He was a consultant on the ENIAC project, and he proposed a computer that could store both program and data in its memory, which became a breakthrough and a significant improvement over the ENIAC design; this became one of his major contributions. According to the textbook, although other computer architectures have been built, none has had as much commercial success to date, which means there is a high probability that von Neumann's architecture will remain the standard architecture for computers in the future.
According to (Englander, 2009), "The 'cloud' is intended to indicate only that there's a link of some kind between the client and the server" (p. 51), which means the communication channel represents, in this case, the communication between the web browser and the web server. A huge data centre, such as one operated by Google, can still be represented by a communication channel. Why? Because when the user types the URL, the request goes to the data centre instead of to another single computer. One of the few things that changes between my home network and the connection to the data centre is that more security would be involved with the data centre, because it holds a much larger store of data. The user just needs to access the information by placing a request to a Google server, and since Google maintains numerous data centres, there is no requirement for the user to know about the intermediate request-to-response process. This is totally different from the user simply inputting a URL and sending a message to a single other computer that runs web server software. If the required data is not accessible, the request will be passed to other servers until the required data is obtained. The user does not see the whole process and may feel that they are getting the data from the same system, which means the cloud symbol in Figure 1.2 represents the transmission of information. In response, the other computer will send back a web page as output, which is then displayed in the web browser on the user's screen. This exchange follows the Hypertext Transfer Protocol (HTTP), which "is used as a standard for web message exchanges" (p. 9).
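The request-and-response exchange described above can be sketched in Python. This is only an illustration of the plain-text format HTTP messages take; the host, path, and response body here are invented placeholders, not real addresses or data.

```python
# A minimal sketch of the request a web browser sends when the user
# types a URL such as http://www.example.com/index.html.
# Host and path are illustrative placeholders.
host = "www.example.com"
path = "/index.html"

# An HTTP/1.1 GET request is plain text: a request line, headers,
# then a blank line marking the end of the message.
request = (
    f"GET {path} HTTP/1.1\r\n"
    f"Host: {host}\r\n"
    f"Connection: close\r\n"
    f"\r\n"
)
print(request)

# The server's reply follows the same text format: a status line,
# headers, a blank line, then the HTML the browser renders.
response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html><body>Hello</body></html>"
)
status_line = response.split("\r\n", 1)[0]
print(status_line)  # HTTP/1.1 200 OK
```

The user never sees this text: the browser builds the request and renders the response, which is exactly the abstraction the cloud symbol stands for.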
The textbook states, "Note that the exact nature of the channel is not important for discussion"; that is because in Figure 1.2, the message to be passed from the web browser to the web server is simple, which means no adjustment is needed. Abstraction, according to (Brown, 2016), "is a mental model that removes unneeded complexity," which means it represents the important features of a system without stating the intricate parts of the entire system. In this case, the user does not need to know how a message passes from one host to the server, or which channel is used for communication. They just need to know that if they take a picture it will automatically upload, not how it is uploaded.
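The photo-upload example of abstraction can be sketched in code: the user calls one operation, and every detail of the channel and the transfer stays hidden behind it. All names and behaviour here are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch of abstraction: helper functions (prefixed with
# an underscore) hide the details the user never needs to see.
def _open_channel(server):
    # The kind of channel (Wi-Fi, cable, the "cloud") is hidden here.
    return f"channel-to-{server}"

def _send(channel, data):
    # How the bytes actually travel is also hidden.
    return f"sent {len(data)} bytes over {channel}"

def upload_photo(photo_bytes, server="photos.example.com"):
    """All the user needs to know: call this and the photo is uploaded."""
    channel = _open_channel(server)
    return _send(channel, photo_bytes)

print(upload_photo(b"12345"))
# sent 5 bytes over channel-to-photos.example.com
```

The public function is the abstraction; the underscore-prefixed helpers are the "intricate parts of the entire system" the user is spared from.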
The hardware required for storing and retrieving the data is the hard disk.
If Company A is giving a presentation to an external company (Company B) about a project, it does not need to tell Company B about every minor department handling the project, so the example in Figure 2.5 (a) would be appropriate. For the detailed example above, however, if Company A decides to run the project internally, the presentation has to include a detailed plan involving all major and minor departments.
The sampling theorem states that a signal should be sampled at a rate that is at least double the maximum frequency a user wants to reproduce. A time-varying signal is sampled at such a rate by an analog-to-digital (A-to-D) converter, and the theorem serves as a guide for converting analog signals to digital form. A higher sampling rate means a higher-quality digital signal, closer to the signal being sampled. This also translates to larger files being stored, which means more bandwidth will be needed to transmit them. The original time-varying signal can then be recovered from the samples.
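The rule above can be sketched in Python: the minimum (Nyquist) rate is twice the highest frequency, and a higher sampling rate produces more samples for the same duration, which is why higher quality means larger files. The frequencies used are illustrative examples, not values from the text.

```python
import math

def nyquist_rate(max_freq_hz):
    """Minimum sampling rate needed for a signal whose highest
    frequency component is max_freq_hz (sampling theorem)."""
    return 2 * max_freq_hz

def sample_sine(freq_hz, sample_rate_hz, duration_s=1.0):
    """Sample a sine wave of the given frequency, as an A-to-D
    converter would, returning one amplitude value per sample."""
    n = int(sample_rate_hz * duration_s)
    return [math.sin(2 * math.pi * freq_hz * t / sample_rate_hz)
            for t in range(n)]

# Example: a 4 kHz tone needs at least an 8 kHz sampling rate.
print(nyquist_rate(4000))  # 8000

# Sampling one second at a higher rate stores many more values,
# which is why better quality means larger files and more bandwidth.
high_rate = sample_sine(440, 44100)
low_rate = sample_sine(440, 8000)
print(len(high_rate), len(low_rate))  # 44100 8000
```

Sampling below the Nyquist rate loses information, and the original signal can no longer be recovered from the samples.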