Sunday, April 29, 2012

Module 13 Blog

     The window manager is an essential component of any machine. It is a layer of software that interprets mouse clicks and keystrokes and tells the desktop environment where you clicked, how many times, and what happened. A desktop environment is the larger, customizable layer built around a window manager. For instance, my computer uses Metacity as its window manager, while its desktop environment is GNOME Classic. Besides Metacity (which seems to be the only one that runs Ubuntu well on my machine), there are many window managers that can be paired with many different desktop environments. It is truly the most customizable interface for home computers at the moment. Some of note are:
Compiz - replaces Mutter/Metacity on GNOME and other systems; supports 3D effects and is aimed at more advanced users. Supports tabbed windows, which Metacity does not.
KWin - the default window manager for Kubuntu and other KDE systems; has many bells and whistles.
Fluxbox - not meant for those used to Windows. It takes desktop environments in a strange new direction, omitting all bars from the screen and relying instead on a complex context menu accessed by right-clicking the desktop, among other means. It can be very fast and efficient if you know how to use it.
Openbox - the default window manager for LXDE; you would find it in use on a Lubuntu system.

     Unity got a lot of mixed reviews when it first came out, due to its overly Mac-like, casual look. While not technically flawed, it strayed too far from the traditional desktop paradigm (folders, desktop, trash bin, etc.), replacing it with a huge, deformed taskbar that takes up the entire left side of the screen. Some say this design became prominent during the netbook boom a few years back, though I would attribute it to the dev team wanting to mimic Apple and its design. It also seems a bit like Windows 7, what with the dynamic icons that shiver, wiggle, and shake.

Sunday, April 15, 2012

Module 11 Blog

     Network Neutrality, in the shortest description, means absolute freedom for the consumer to access information without being subjected to information control for commercial or political reasons. I am strongly in favor of Net Neutrality.

      In 2005, the FCC released its Broadband Policy Statement, which essentially laid out the groundwork for how ISPs provide service. The statement emphasized access to lawful web content and competition among ISPs to provide fair and affordable coverage.

     Comcast went against the principles of the 2005 FCC statement by essentially strong-arming Level 3, the backbone provider that carries Netflix's streaming video. The Broadband Policy Statement says that "consumers are entitled to access the lawful Internet content of their choice," yet Comcast wouldn't carry Level 3's traffic to its customers unless the company coughed up more money.

     Opponents of Net Neutrality argue that ISPs are businesses like any other, and if you don't want to contend with their policies, then don't buy their service. I counter that the internet is a new medium; it has had only roughly 20 years of mainstream attention, and in that time new problems have arisen that haven't been dealt with. Would it be fine if a phone company placed restrictions on whom you could call? What if you signed up for Vonage, but they had a deal with Pizza Hut to keep people from eating at Papa John's? If you can't call Papa John's to place an order on their network, Vonage makes more money and controls the flow of information.

    

     It would be naive to think that no major business would sacrifice customer satisfaction and fairness for money. An ISP can make more money if it restricts access to users on another ISP or to a commercial service that directly or indirectly affects its own profits. You are entitled to the speed and accessibility that you pay for! "If I pay to connect to the Net with a certain quality of service, and you pay to connect with that or greater quality of service, then we can communicate at that level." (1)
     There are fewer Quality of Service (QoS) restrictions on mobile internet providers, which is most likely the reason for those incessant ads on TV about 3G and 4G speeds, data caps, and service models. I still believe that ISPs should run their businesses to make a profit, thus ensuring quality service for the greatest number of people. But there is a difference between fair service models and ones that gouge consumers at every opportunity with overage fees, sleazy business practices, and heavily inflated prices.
     ISPs are responsible for ensuring reliability. Your speed of access is not conditional; you are entitled to what you pay for, regardless of the provider's opinion of your usage. They should not be able to block, speed up, or slow down data streams based on their content, i.e. YouTube, Netflix, or competing services on another network.


   Net Neutrality means being free of commercially and politically fueled efforts to control your access to the web. This is a major concern when deciding how to manage rampant piracy on the net. There are no federal provisions for internet censorship in America, though the right of ISPs to discriminate is debatable. The failed SOPA and PIPA bills came close to breaking the open internet in the interest of decreasing piracy. I feel that access to any information you want is protected under the Constitution.
     To that effect, I think the end user receiving data should be able to access any site, network, or server connected to the web, regardless of content or bandwidth, so long as it operates within the bounds of the law. Unfortunately, if people can get away with a crime, they will commit it. The main opponents of the open internet are lawmakers and ISPs that want to curb piracy through Deep Packet Inspection (DPI), whereby they would monitor data streams to determine what sort of information is being sent and received. ISPs also want to use this information to create traffic rules and regulations for the sole purpose of increasing profits. (2)
     Telecommunications of any kind should be blind and anonymous. The actual speed/bitrate of a consumer's connection can still be monitored on the surface while remaining anonymous, providing only hints about what the end user is accessing. While I think internet piracy is a major concern for intellectual property rights, I don't think anyone should be able to take away your anonymity on the internet, just as no one should deprive you of your privacy while making a phone call or reading a book. Anonymity covers not just your identity but also what you're doing, and that matters: as stated above, any information about what you do on the web, and its technical characteristics, can be used by your ISP to further commercial greed.
     As it stands today, the authorities cannot monitor any telecommunication without a warrant based on reasonable suspicion. For the average end user of an ISP, 100 simultaneous connections downloading 20-100 gigabytes suggests piracy via a peer-to-peer (P2P) client. In cases like that, ISPs should be able to look at the packets being sent to ensure that they aren't contributing to software/media piracy, though not at their own discretion, for that could lead to a corrupt company making unfair exceptions and granting privileges to whomever it pleases. Remember, P2P is not in itself piracy; the technology has many useful legal functions, but it is also the main tool for acquiring stolen intellectual property. Since P2P doesn't rely on a central source to distribute data, but rather on every end user participating in the swarm seeding data as well as receiving it, piracy is difficult to detect, masquerading as innocent data streams from one person to another. A .torrent file (an index of all the data in the original file) must be acquired before starting the download, so the P2P client knows how big the file is, which pieces go where, and what the finished file will look like.
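A .torrent file is nothing more than a small "bencoded" dictionary of metadata. This is not from any real client, just a minimal sketch in Python to show how little such a file actually contains (the sample dictionary below is made up; real torrents also carry a "pieces" entry holding the SHA-1 hash of every chunk):

```python
# Minimal bencode decoder -- a sketch of how .torrent metadata
# (file name, piece size, etc.) is encoded on the wire.
def bdecode(data, i=0):
    """Decode one bencoded value from bytes, returning (value, next_index)."""
    c = data[i:i+1]
    if c == b"i":                       # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i+1:end]), end + 1
    if c == b"l":                       # list: l<items>e
        i += 1
        items = []
        while data[i:i+1] != b"e":
            item, i = bdecode(data, i)
            items.append(item)
        return items, i + 1
    if c == b"d":                       # dictionary: d<key><value>...e
        i += 1
        d = {}
        while data[i:i+1] != b"e":
            key, i = bdecode(data, i)
            val, i = bdecode(data, i)
            d[key] = val
        return d, i + 1
    colon = data.index(b":", i)         # string: <length>:<bytes>
    length = int(data[i:colon])
    start = colon + 1
    return data[start:start+length], start + length

# A toy "torrent" dictionary (hypothetical values, for illustration only).
sample = b"d4:name8:file.iso12:piece lengthi262144ee"
meta, _ = bdecode(sample)
print(meta)  # {b'name': b'file.iso', b'piece length': 262144}
```

The key point for the piracy debate: the .torrent file itself is a few hundred bytes of bookkeeping, not the infringing content, which is why blocking sites that host them does so little.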
     I propose a system in which a non-human entity is programmed to perform DPI without record or human interference. This would allow for "red flags": the monitoring system detects a pattern characteristic of illegal downloads, i.e. the same chunks of data making their way across the country to seemingly random end users without any common location, time of access, or purpose. Only at that point may an actual human from the company step in and perform DPI to determine what the data contains and whether legal intervention is required.
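As a toy illustration of the "red flag" idea, entirely hypothetical (the function name, observation format, and threshold are all made up for this sketch), the automated pass could be as simple as counting how many distinct users a given chunk of data reaches, with no human seeing anything until a threshold trips:

```python
from collections import defaultdict

def find_red_flags(transfers, threshold=3):
    """Flag chunk hashes observed crossing the network to many distinct users.

    `transfers` is an iterable of (chunk_hash, user_id) observations.
    Any chunk reaching at least `threshold` distinct users is flagged for
    *human* review; the machine records nothing else about the streams.
    """
    seen = defaultdict(set)
    for chunk_hash, user in transfers:
        seen[chunk_hash].add(user)
    return {h for h, users in seen.items() if len(users) >= threshold}

observations = [
    ("abc123", "user1"), ("abc123", "user2"), ("abc123", "user3"),
    ("def456", "user1"), ("def456", "user1"),  # one user twice: not flagged
]
print(find_red_flags(observations))  # {'abc123'}
```

A real system would obviously need far more signal than a raw user count, but the shape is the point: the pattern detection is mechanical and anonymous, and only flagged hashes ever reach a human.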
     Net Neutrality is the polar opposite of the internet situation in China, where the content-restriction bottleneck is intertwined with the government rather than individual ISPs and is used to control political information. SOPA/PIPA aimed to let the government lock out entire domains that provide .torrent files describing stolen intellectual property, though it would effectively have allowed the federal government to deny access to any source it did not like. Besides, .torrent files are minuscule and could be transmitted by any means on the web, so these bills would have been less than useless in curbing piracy; they would just have made it a little more time-consuming for the pirate. It's a slippery slope: once the government has the power to cut off access wherever it wants, where does it stop? If people in Europe or other parts of the world publish anti-American sentiments, what's to stop the government from banning access? It gives too much discretion and power to the government and ISPs. Intellectual property theft doesn't fall under "freedom of speech," and SOPA/PIPA was never intended to quash your right to read or write anything, but the slope is slippery all the same. Piracy is not freedom of speech, but Network Neutrality is! Being able to visit any site you want without fear of reprisal is 100% protected under the First Amendment. Acting on that information in an illegal way, however, is a crime, and is not protected in the slightest. It's just like walking into a library and checking out a book on bomb-making. You should have the right to read about explosives to your heart's content, but that doesn't mean it's OK to start making pipe bombs in your basement.

Keep the net free! (Free as in freedom)

We really need a nationwide mesh network :/ free information for all!


(1) http://dig.csail.mit.edu/breadcrumbs/node/144
(2) http://arstechnica.com/hardware/news/2007/07/Deep-packet-inspection-meets-net-neutrality.ars/2 [last paragraph]

Sunday, April 8, 2012

Module 10 Blog

     I've been using Ubuntu for a solid month now, and I never want to go back to Windoze! It is fair to say I love this OS, but I'm still unlearning Windows. I find myself ever more curious about SSH, as I would love to be able to use my desktop computer on the go through my laptop.

     1) The sorts of jobs that would benefit from Linux experience are mostly network maintenance/administration and software engineering. Personally, I think I would be better suited as a software engineer. I am familiar with C and C++, and I've always thought that object-oriented languages are easy to understand. Depending on who hired me to write software, I would most likely be part of a team.
         A local computer repair shop would also benefit from a staff knowledgeable about Linux, though admittedly most of their customers would be Windows users and wouldn't need that help.

     2) Many of these jobs are in companies switching to Linux for its reliability and stability, and the demand for people with Linux experience is steadily increasing. Just as well, since there is a huge store of information on the web about solving problems with specific firewall settings, errors, and specialized software. Comparable information simply doesn't exist for Windows; when it does, it's written by a Microsoft flunky and is most likely difficult to find.

     3) The salary for a career as a Linux system admin ranges anywhere from $75,000 to $100,000 per year. A Linux software engineer (depending on how much he/she works) can make anywhere from $60,000 to $120,000 per year. I don't really care about the money. It's nice to hear success stories of people making six figures a year, but I would just be happy with a comfortable, secure job where I can be creative and solve problems!

     4) The only real requirement for a programmer is that your code runs and is free of bugs. Being a network administrator for a Linux-based network requires 5+ years of Linux system administration and engineering experience, DNS fundamentals, and working knowledge of complex web-hosting components, including firewalls, load balancers, web and database servers, and virtualization software.

     5) The most interesting thing to me is that, since we live in the information age, there is a good chance a systems analyst or network administrator need not come into work every day, or even every week. Some work from home occasionally, some entirely, but either way it's a distinct possibility.
        Microsoft trying to indoctrinate Best Buy workers with anti-Linux lies: http://linuxologist.com/wp-content/uploads/2009/09/Linux-MS-FUD5.jpg

Friday, March 30, 2012

Module 9 Blog

I am glad to hear of Brazil's decision to encourage open-source software. It's one thing for an end user to choose the "de facto" OS, such as Windows, but it's another thing for businesses. I feel that any business lacking a huge amount of start-up capital, especially a tech-oriented business, is at a huge disadvantage because of established "norms" in the computing world, where its only viable option is a closed-source OS that can cost upwards of $150.00! Linux is free, easily modifiable to suit the needs of a specialized business, and more stable (not only in resource management, but against malware and viruses too). Many of the computers in Brazil are old, even positively ancient! Linux is a fantastic OS because it's not hardware-dependent: there is always a distro out there that can run on whatever hardware you have lying around.

If I were going to build a network to run the registers and track inventory in a Walgreens (I work there, so this is from experience), and I knew the store's hardware was 20 years old, I would research which version of Linux could run on a 66 MHz Pentium with 32 MB of RAM, with driver support for scanners, printers, and the specialized cashier displays. Unfortunately, everything must run Windoze! So these doddering old computers are wasted running a weird version of MS-DOS. They were installed in a time without ubiquitous internet access, when it was difficult to get the word out about free, open-source software, let alone distribute it.

There is no excuse for the modern day monopoly on the operating system market.

Sunday, March 11, 2012

Module 7 Blog (Week 5)

How to go about producing open-source software:

To produce anything that could be considered "open source," it must carry the proper license. The primary choice for ensuring that our product is "free as in freedom" is to release it under the GNU General Public License (GPL).

The GPL is a copyleft license, which means that the initial product, as well as its derivatives, is free for anyone to distribute and change for any reason. The license is maintained by the Free Software Foundation (FSF). You can ensure your product is considered open source by including a copy of the GPL with it, or by using a modified version of the license that does not mention GNU. The license text itself is not copyleft, but anything released under it is.

I hope this clears up any confusion you may have had about open source software. Thank you for your time.


Sunday, February 26, 2012

Module 6 Blog (Blog #4)

Linux kernels are named in an X.Y.Z fashion, with X denoting the major version of the kernel. My terminal reported that our lab machine is running version 2.6.24-24-generic: major version 2 (introduced in 1996), stable series 6, and patch release 24, with "-24-generic" being Ubuntu's own package revision and flavor. The numbers are just a way of marking changes between releases. When you refer to a "version of the Linux kernel," it's usually the X.Y pair. Sometimes a fourth number is added to denote the fix of a major error. After the 20th anniversary of Linux, however, the numbering jumped from 2.6.39 to 3.0.

All in all, it's just a system for easily identifying new releases of the Linux kernel, making it easy for developers to write code that will run on Linux.

UPDATE: The jump from 2.6.39 to 3.0 had little to do with major changes; rather, the release coincided with the 20th anniversary of Linux. An example of a major error that would warrant a fourth number: version 2.6.8 had a serious bug in its Network File System code, and an immediate fix was needed. It wasn't enough to warrant an entirely new release number, i.e. 2.6.9, so the fix shipped as 2.6.8.1.
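The numbering scheme above is regular enough to pull apart mechanically. Here is a small sketch in Python (my own illustration, with the release strings hard-coded rather than read from `uname -r`):

```python
import re

def parse_kernel_version(release):
    """Split a Linux release string like '2.6.24-24-generic' into its parts."""
    m = re.match(r"(\d+)\.(\d+)\.(\d+)(?:\.(\d+))?(?:-(.*))?$", release)
    if m is None:
        raise ValueError("unrecognized release string: " + release)
    major, minor, patch, extra_fix, local = m.groups()
    return {
        "major": int(major),   # e.g. 2 -- the kernel generation
        "minor": int(minor),   # e.g. 6 -- the long-running stable series
        "patch": int(patch),   # e.g. 24 -- release within that series
        "fix": int(extra_fix) if extra_fix else None,  # the occasional 4th number
        "local": local,        # distro suffix such as '24-generic'
    }

print(parse_kernel_version("2.6.24-24-generic"))
print(parse_kernel_version("2.6.8.1"))  # the NFS emergency fix gets fix=1
```

On a live system the same function could be fed `platform.release()` instead of a literal string.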

Multiboot computer

My 6-year-old laptop came with Windows Vista installed, but the hardware was incapable of running it well. It now runs Windows 7, Windows XP, and Ubuntu 11.10. I'm still wondering which bootloader I should use, GRUB or the Windows bootloader. I'm sure the choice will become clear when I install Fedora alongside Ubuntu     XD