Sabtu, 27 Desember 2008

Shopping For a Computer - Things to Consider

Buying a computer is the first step anyone must take to join the computer and Internet revolution. Approach it the way you would approach buying a car: treat it as a significant investment, because for most people it is a relatively infrequent purchase. The accessories that go along with it, however, need replacing more often than most of us would like, so factor them in when researching the available products.

Computers can do much more than you might think and will be a significant part of everyone's future. They vary in price according to their power and functionality, and they have become an integral part of our everyday lives, from basic typing to shopping on the Internet. Computers are here to stay, so everyone should consider learning how to perform their own PC upgrades and minor repairs.

Computers are like cars in another way: faster is not automatically better. They are complex systems, and judging one feature while ignoring the rest is not an intelligent way to buy either a car or a computer. A computer should be cleaned and checked every two to three months; machines rarely break down completely, but they do develop frequent small problems. And while prices keep falling, computers now offer far more than just being a business aid.

But what a lot of people don't realize when they start looking into buying a computer is that many of the big brand companies sell very out-of-date computers in their lower price ranges. I'm not kidding about this -- most of the big computer companies, when they sell their least expensive models, are trying to unload old inventory that has been collecting dust on their shelves for a long time. More disturbing still, from what I've heard, those machines often contain parts that are *known* to be faulty.

Now you'd think these faulty parts would be thrown away, but no -- from what I've heard, the makers still sell them to the big computer companies at a discount, and those companies put them into their computers anyway.

So if you can find a reputable local company that assembles and sells its own computers, you will get a well-built machine for a lot less, because you are not paying for a brand name. Do your research well.

Buying a computer is a big deal for a lot of people, and understandably so. It is a very personal decision: no longer just buying a tool, but something closer to an emotional purchase of a beautiful piece of furniture that adds character to the space around it.

Buying a computer is not always at the top of your shopping list, especially with the current jump in food and gas prices, and it is always an exercise in compromise.

All of the above can be summarised by saying that buying a computer is like buying a car or a home: give it the same respect and research, and you will not go far wrong.

Kamis, 25 Desember 2008

Peer to Peer Networking

Peer to Peer

"Peer to peer" was originally used to describe the communication of two peers and is analogous to a telephone conversation. A phone conversation involves two people (peers) of equal status communicating over a point-to-point connection. Simply put, that is what P2P is: a point-to-point connection between two equal participants.
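The symmetry described above can be illustrated with a minimal sketch in Python. Here `socketpair()` creates two connected endpoints with no client/server distinction, which is one simple way to show a point-to-point link between equal peers (the peer names are of course just illustrative):

```python
import socket

# socketpair() returns two connected sockets of equal status -- there is
# no "server" and no "client". Either side may send first, just as either
# party in a phone call may speak first.
peer_a, peer_b = socket.socketpair()

# A sends, B receives...
peer_a.sendall(b"hello from A")
print(peer_b.recv(1024).decode())  # hello from A

# ...and the roles are fully symmetric: B can send to A the same way.
peer_b.sendall(b"hello from B")
print(peer_a.recv(1024).decode())  # hello from B

peer_a.close()
peer_b.close()
```

In a real P2P application each peer would instead open a listening socket and also connect out to other peers, but the essential point is the same: both endpoints have equal status on the connection.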

The Internet started as a peer-to-peer system. The goal of the original ARPANET was to share computing resources around the USA. Its challenge was to connect a set of distributed resources, using different network connectivity, within one common network architecture. The first hosts on the ARPANET were several U.S. research sites: the University of California, Los Angeles (UCLA), UC Santa Barbara, SRI, and the University of Utah. These were already independent computing sites of equal status, and the ARPANET connected them as such, not in a master-slave or client-server relationship but rather as equal computing peers.

From the late 1960s until 1994, the Internet had one model of connectivity. Machines were assumed to be always switched on, always connected and assigned permanent IP addresses. The original DNS system was designed for this environment, where a change in IP address was assumed to be abnormal and rare, and could take days to propagate through the system. However, with the invention of Mosaic, another model began to emerge in the form of users connecting to the Internet from dial-up modems. This created a second class of connectivity because PCs would enter and leave the network frequently and unpredictably. Further, because ISPs began to run out of IP addresses, they began to assign IP addresses dynamically for each session, giving each PC a different, possibly masked, IP address. This transient nature and instability prevented PCs from being assigned permanent DNS entries, and therefore prevented most PC users from hosting any data or network-facing applications locally.
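The DNS assumption described above -- a stable name mapping to a stable address -- is easy to see in code. A sketch of name resolution in Python, using `localhost` because it resolves locally and needs no network access: a dial-up PC with a different IP each session could not be given a stable entry like this, so it could not be found by name.

```python
import socket

# Resolve a stable name to the address(es) currently behind it.
# getaddrinfo() returns tuples of (family, type, proto, canonname,
# sockaddr); for IPv4, sockaddr is (ip_address, port).
infos = socket.getaddrinfo("localhost", None, family=socket.AF_INET)
addresses = sorted({info[4][0] for info in infos})
print(addresses)  # typically ['127.0.0.1']
```

A machine whose address changes every session breaks this model: by the time a DNS change propagated (which, as noted above, could take days), the dial-up PC would already be on a different address.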

For a few years, treating PCs as clients worked well. Over time, though, as hardware and software improved, the unused resources sitting behind this veil of second-class connectivity started to look like something worth getting at. Given the vast array of processors now available, the software community is starting to take P2P applications very seriously. Most importantly, P2P research is concerned with addressing some of the main difficulties of current distributed computing: scalability, reliability, and interoperability.

I work as a technical writer and marketing specialist for the software development company Dana Consulting Inc., which offers the ultimate network monitoring online service at Dotcom-Monitor.com.