
The History of the Computer Internet: English Literature and Translation (Part 4)

A web spider is a program used by search engines that goes from page to page, following every link it can find. This means a search engine can map out as much of the Internet as its own time and speed allow.
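The paragraph above describes a spider only in prose; the following is a minimal sketch of how such a crawler might look, assuming Python and only the standard library. The starting URL and page limit are placeholder parameters, not anything taken from the original text, and a real search-engine crawler would also respect robots.txt, throttle requests, and store page content for indexing.

# Minimal web-spider sketch: fetch a page, collect its links, and keep
# following them breadth-first until a page budget runs out.
import re
import urllib.request
from collections import deque

def crawl(start_url: str, max_pages: int = 20) -> set:
    seen = set()                 # pages already fetched
    queue = deque([start_url])   # pages waiting to be fetched
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue             # skip pages that fail to load
        seen.add(url)
        # Naive link extraction: absolute http(s) links in href attributes.
        for link in re.findall(r'href="(https?://[^"#]+)"', html):
            if link not in seen:
                queue.append(link)
    return seen

if __name__ == "__main__":
    # "https://example.com" is just a placeholder starting point.
    for page in crawl("https://example.com", max_pages=5):
        print(page)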
An indexed collection uses hand-added links. For instance, on Yahoo's site you can click on Computers & the Internet, then on Hardware, then on Modems, and so on; at each level, the section lists sites related to the category you are in.
A metasearch engine queries many search engines at once, typically gathering the top choices from about ten of them, which makes searching considerably more effective; a small sketch of the idea follows.
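Here is a hedged illustration of merging ranked results from several engines into one list, assuming Python. The backend functions (search_engine_a and so on) are hypothetical stand-ins; a real metasearch tool would call each engine's actual query interface.

# Metasearch sketch: query several engines, then merge their ranked results.
# The backend functions are placeholders that each return an ordered list of
# result URLs for the query.
from collections import defaultdict

def search_engine_a(query): return ["http://a.example/1", "http://shared.example"]
def search_engine_b(query): return ["http://shared.example", "http://b.example/2"]
def search_engine_c(query): return ["http://c.example/3", "http://shared.example"]

BACKENDS = [search_engine_a, search_engine_b, search_engine_c]

def metasearch(query, top_n=10):
    scores = defaultdict(float)
    for backend in BACKENDS:
        for rank, url in enumerate(backend(query)[:top_n]):
            # Higher placement on any engine earns a larger score;
            # URLs returned by several engines accumulate score.
            scores[url] += 1.0 / (rank + 1)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(metasearch("modems"))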
Once you are able to use search engines, you can effectively find the pages you want.
Since the arrival of networking and multi-user systems, security has been on the minds of system developers and operators. As far back as the early days of AT&T and its phone network, hackers have been finding ways to break into systems. At first this was not a major problem, since networking was limited to large corporations and government computers that could afford the necessary security.
The biggest problem nowadays is personal information. Why should you be careful when making purchases through a website? Let's look briefly at how the Internet works.
The user is transferring credit card information to a web page. Looks safe, right? Not necessarily. As the user submits the information, it is streamed through a series of computers that make up the Internet backbone, broken into small chunks called packets. Here is the problem: while the data travels across this backbone, what prevents a "hacker" from intercepting the stream at one of the backbone points?
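To make the packet idea concrete, here is a small sketch, assuming Python: a message is split into fixed-size packets, and any machine relaying them can read the full content if nothing is encrypted. The 16-byte packet size and the fake card number are arbitrary choices for illustration.

# Packetisation sketch: plaintext sent across the backbone is just a
# sequence of small chunks, readable by any node that relays them.
def to_packets(data: bytes, size: int = 16) -> list:
    """Split a message into fixed-size packets (size chosen arbitrarily)."""
    return [data[i:i + size] for i in range(0, len(data), size)]

message = b"CARD 4111111111111111 EXP 12/29"   # fake card number for the demo
packets = to_packets(message)

# A relay node on the backbone sees every packet; without encryption,
# reassembling the original message is trivial.
intercepted = b"".join(packets)
print(packets)
print(intercepted == message)   # True: the eavesdropper has the full data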
Big Brother is not watching you when you access a web site, but users should be aware of potential threats while transmitting private information. There are methods of enforcing security, such as password protection and, most importantly, encryption.
Encryption means scrambling data into a code that can only be unscrambled on the "other end." Browsers like Netscape Communicator and Internet Explorer include encryption support for making online transfers. Some encryption schemes work better than others. The most advanced encryption system is called DES (the Data Encryption Standard), and it was adopted by the US Defense Department because it was deemed so difficult to crack that it was considered a security risk if it fell into another country's hands.
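As a toy illustration of the single-key idea, the sketch below uses a simple repeating-key XOR rather than real DES; it only shows that the same secret key scrambles the data and unscrambles it on the other end, and it offers no real security.

# Toy symmetric cipher (repeating-key XOR), illustrating the single-key idea
# behind DES: the same secret key both encrypts and decrypts. This is NOT DES.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"sharedsecret"                      # known only to sender and receiver
plaintext = b"credit card: 4111 1111 1111 1111"
ciphertext = xor_cipher(plaintext, key)    # scrambled on the sender's side

print(ciphertext)                          # unreadable without the key
print(xor_cipher(ciphertext, key))         # the "other end" recovers the text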
DES uses a single key to unlock an entire document. The problem for an attacker is that there are roughly 72 quadrillion (2^56) possible keys, so it is a very difficult system to break by brute force. One DES-encrypted document was cracked and decoded, but it took the combined effort of about 14,000 computers networked over the Internet working for a considerable time, and most hackers do not have that many resources available.
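The key-space figure can be checked with a few lines of arithmetic, assuming Python. The trial rate and machine count below are illustrative assumptions used only to give a feel for the scale, not measured benchmarks or figures from the original text.

# DES uses a 56-bit key, so the key space is 2**56.
key_space = 2 ** 56
print(f"{key_space:,}")           # 72,057,594,037,927,936 (~72 quadrillion)

# Rough brute-force estimate under an assumed rate of 1 million keys/second
# per machine and 14,000 cooperating machines (both numbers are illustrative).
keys_per_second = 1_000_000 * 14_000
expected_trials = key_space / 2   # on average, half the keys must be tried
seconds = expected_trials / keys_per_second
print(f"about {seconds / 86400:.1f} days on average")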

