2011

HOW TO : Download content from Oracle Metalink (Support) using wget

The usual process for a DBA to download files from the Oracle Metalink (support) site is

  • Log into Metalink from his/her workstation
  • Download the file
  • Upload the file to the database server
  • Use the file

Say your database is in a data center and your workstation doesn’t have high speed connectivity to the data center. In that case, you can use the following trick to download content to a Linux server in the data center that has Internet connectivity (and hopefully it is not your database server 🙂 ).

  • Log into Metalink from your workstation
  • Grab the link to the file/content you want to download (for example, we recently tried to download the clusterware for Oracle 11g, and the link was http://download.oracle.com/otn/linux/oracle11g/linux.x64_11gR2_clusterware.zip)
  • Log into a server in your data center (it should have connectivity to the Internet and also to your database server)
  • Download the file using wget

[bash]wget http://download.oracle.com/otn/linux/oracle11g/linux.x64_11gR2_clusterware.zip --user ORACLE_ID --password ORACLE_ID_PASSWORD[/bash]

  • Replace the link with the link to your content and use your Oracle ID and password.
  • The downloaded file will have a strange name, since the download link includes session parameters and wget keeps them in the file name. In the example I used above, the name of the file was “linux.x64_11gR2_clusterware.zip\?e\=1297470492\&h\=a66b265cc967a68c611052cb8e54356f”
  • Rename the file and strip off the unnecessary data in the name using mv, as shown in the example below
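A minimal sketch of the rename, assuming the session parameters from the example above (the actual file name on your server will differ):

[bash]
# strip the session parameters that ended up in the downloaded file name
mv "linux.x64_11gR2_clusterware.zip?e=1297470492&h=a66b265cc967a68c611052cb8e54356f" linux.x64_11gR2_clusterware.zip
[/bash]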

HOW TO : Capture HTTP Headers using tcpdump

Quick how to on capturing HTTP headers using tcpdump on a web server (running Linux).

  • On the web server, issue the following command

[bash]tcpdump -s 1024 -C 1024000 -w /tmp/httpcapture dst port 80[/bash]

  • Stop the capture by issuing the break command (ctrl + c)
  • Open the capture file (httpcapture in this example) in Wireshark and check out the headers under the HTTP protocol
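If Wireshark isn’t handy, one rough way to peek at the captured request headers is to read the capture file back with tcpdump in ASCII mode. A quick sketch (the header names in the grep pattern are just examples):

[bash]
# replay the capture file, print packet payloads as ASCII,
# and pull out a few common HTTP request header lines
tcpdump -nn -A -r /tmp/httpcapture | grep -iE 'GET |POST |Host:|User-Agent:|Referer:'
[/bash]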

Overheard : Comment about being a true Motivator

I was interviewing a candidate for a job opening at my work. I asked him how he motivates his team, and he made this comment about how you truly measure whether you can motivate a person:

Think of a 16 year old working part time at a retailer. She is just working to earn money to buy some lipstick or perfume. Think how you can motivate her to work on Thanksgiving weekend!!

RESOLUTIONS : 2011 : January Update

As I mentioned here, I have made some resolutions for 2011. As with any good task list, it is worthless unless you take a look at it periodically and update it :). I am going to publish an update on each one of the resolutions every month. Here goes the first one

1. Lose weight (AKA lose the gut)
  • I am practicing part of the diet proposed by Tim Ferriss in his Four Hour Body book. I am eating 2 egg whites for breakfast and then eating a small meal every 4 hours. I haven’t gone completely into the whole no “white” carbs diet he proposes though.
  • I also started tracking my weight and diet religiously on a daily basis. This is another of Tim’s ideas. He says that by tracking your weight every day, you subconsciously start making better choices in terms of the food you eat. I think it makes sense :). I am tracking the data in a Google spreadsheet. Here is a chart of my weight for the last month. I started out at 194 lbs and am now at 188 lbs. Hopefully I will be able to keep this downward trend.
  • I also started working out (thanks to Jhanvi). We are working out at least 2 times a week.
2. Increase web traffic to kudithipudi.org
  • I started posting more content on the site. I posted 8 articles in January.
  • No particular strategy other than writing more content, which will hopefully bring more traffic.
  • Have the following topics to write on (and some of them have been pending for a long time)
    • Moving your life to the cloud
    • Setting up a virtual server on the Rackspace Cloud Infrastructure
    • Configuring syslog-ng
    • Configuring nginx to reduce resource utilization on a Linux server
3. Achieve CISSP certification
  • No progress on this one at all.
4. Go on a vacation
  • Jhanvi and I planned to go to the Travel and Adventure Expo that was held in Rosemont last weekend, but we got too lazy :).
  • Our trip to India (and it doesn’t count as a vacation 🙁 ) is planned for April.

Progress on 2 out of the 4 resolutions!! Not bad :).

PS : Thanks for all the support I have been getting on the first resolution :). I didn’t realize the situation was so bad :).

HOW TO : Configure Cache Expiration in Apache

Cache servers depend on cache control headers provided by the web server. Essentially, the web server (based on its configuration) specifies what content is cacheable and for how long. (Note: Some cache servers might ignore this and apply a default cache period for specific content. But that is for another post 🙂 )

Here is a quick and dirty way to configure an Apache 2.x server to enable cache control settings on all content in a directory

[bash]
# ExpiresActive / ExpiresDefault require mod_expires to be loaded
ExpiresActive On
<Directory "/var/www/html/static">
    Options FollowSymLinks MultiViews
    Order allow,deny
    Allow from all
    ExpiresDefault "modification plus 1 hour"
</Directory>
[/bash]

This configuration tells Apache to enable cache headers for all content in the /var/www/html/static folder. The cache expiration is set to 1 hour after the modification time of the content.
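One quick way to verify the headers are being returned is to request a file in that directory and inspect the response headers. A sketch with curl (the file name here is just a placeholder for something under the static directory):

[bash]
# fetch only the response headers and look for the cache-related ones
curl -sI http://localhost/static/test.css | grep -iE 'Expires|Cache-Control'
[/bash]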

Analytics in the Cloud : Not there yet

I attended a webinar hosted by Deepak Singh from Amazon’s Web Services group on analytics in the cloud. He made a very compelling case for utilizing the cloud to build out your analytics infrastructure. Especially with the growing data sizes that we deal with now, I think it makes absolute sense. You can utilize different software stacks and grow (and shrink) your hardware stack as required. Great stuff.

But there is a catch. Most of the data generated by organizations today is “inside” their perimeters. Whether it is the OLAP database collecting all your data or the application that spews gigabytes of logs, most of the data is housed in your own infrastructure. So if you want to use the cloud to perform analytics on this data, you first have to transfer it to the cloud. And therein lies the problem. As Deepak mentioned in the webinar, human beings have yet to conquer the limitations of physics :). You need a pretty big pipe to the Internet just to transfer this data.

Amazon has come up with various ways to help with this issue. They are creating copies of publicly available data sets within their cloud so that customers don’t have to transfer them. They are also working with companies to host private data sets in the cloud for other customers to use. Similar to how you can spin up a Red Hat AMI by paying a license fee to Red Hat, I believe they are looking at providing customers access to these private data sets for a fee paid to the company providing the data set. It is a win-win-win situation 🙂 for Amazon, the company providing the private data set, and Amazon’s web services customers. They also support a one-time import of data from physical disk or tape.

Coming back to the title of this post :). I think this field is still in its infancy. Once companies start migrating their infrastructure to the cloud (and yes, it will happen; it is only a matter of time :) ), it will be a lot easier to leverage the cloud for your analytics. All your data will already be in the cloud, and you can start leveraging the hardware and software stacks there.

LinkedIn Network Map

LinkedIn (the professional networking site) is providing a way to map your network to see where you have your strongest connections. Here is a map of my network. You can click on the image to get to the live map.

My strongest connections so far are at

I wish they came up with a map showing the locations of my network too. That way, I could find out whether I can get a job in New Zealand through my network :).