Wednesday, December 5, 2012

Import selected rows into HBase from Export Backup

Taking a full-table backup of your HBase tables is quite natural, but importing only selected rows back is not possible out of the box. If your row keys share a common substring, you can now selectively import data from your Export output.

The above code was tested on Hadoop 1.0.1, HBase 0.92.1.
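
For illustration, here is a minimal sketch of the idea: a map-only job that reads the Export output (a SequenceFile of ImmutableBytesWritable / Result pairs) and re-imports only the rows whose key contains a given substring. The class names and the selective.import.substring property below are illustrative, not the original code.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;

public class SelectiveImport {
  // Emits a Put only when the row key contains the configured substring.
  static class FilterMapper
      extends Mapper<ImmutableBytesWritable, Result, ImmutableBytesWritable, Put> {
    private String match;

    @Override
    protected void setup(Context ctx) {
      match = ctx.getConfiguration().get("selective.import.substring");
    }

    @Override
    protected void map(ImmutableBytesWritable row, Result value, Context ctx)
        throws IOException, InterruptedException {
      if (!Bytes.toString(row.get()).contains(match)) return;  // skip other rows
      Put put = new Put(row.get());
      for (KeyValue kv : value.raw()) {
        put.add(kv);  // carry over every cell of the exported row
      }
      ctx.write(row, put);
    }
  }

  public static void main(String[] args) throws Exception {
    // args: <export-dir> <target-table> <substring>
    Configuration conf = HBaseConfiguration.create();
    conf.set("selective.import.substring", args[2]);
    Job job = new Job(conf, "selective-import");
    job.setJarByClass(SelectiveImport.class);
    job.setMapperClass(FilterMapper.class);
    job.setInputFormatClass(SequenceFileInputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    TableMapReduceUtil.initTableReducerJob(args[1], null, job);  // Puts go to the table
    job.setNumReduceTasks(0);  // map-only
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}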

Monday, August 13, 2012

Physique - Marrying Physics and Social Computing

Consider a world where "living" beings aren't the only "living" entities anymore, where things considered impossible / immovable freely communicate with each other. I guess "The Matrix" has crossed your mind by now, if you are by any chance a tech buff like me.

That is exactly what I always wanted to see / build in real life, not just in the movies. At Yahoo! Open Hack 2012 (5th edition in India), these guys provided a problem statement under "Digital Communication":

Your challenge is to build a product or feature that solves a problem in the digital communications space. Think about how you can add to the existing communications landscape. Your solution may be mass market, or specific to a particular segment of the marketplace, like development teams for example. Where appropriate, use Yahoo! Mail or Messenger APIs to add value to your hack.  
Source - http://openhackindia2012.hacker.yahoo.net/#discussions/49989

Reading a lot of articles (including this one on TechCrunch, published less than 24 hours ago, during the hackathon) and news, and personally watching a lot of changes and trends in the market myself, I am completely convinced there is a lot of scope in the "Digital Communication" space.

I check my Facebook and Twitter feeds often, but when you are part of a gathering of this size, 1 in every 3 - 5 machines has either Twitter or Facebook open ALWAYS. I realized that social networking has become so much a part of our lives.

With all this in mind, allow me to introduce "Physique" -- something we were able to develop in ~30 hours, hoping to change the way "machines" communicate and collaborate in the future, for humans.

Physique is all about everyday things sharing their current state information with you in real time. Imagine if all the things you own had a digital identity (say, like a Facebook Profile) and shared their information continuously with you like your social media friends do. That is what Physique aims to be!

We call all physical objects "Thngs" (yes, intentionally) and provide an API to access them. Basically, you build apps on top of our platform for the objects to connect with each other and to enable collaboration among them.

Some high-level example use-cases (things that could be reality less than 50 years from now):

  1. Your fridge can order milk cartons for you when you are running out of milk.
  2. Your television will record certain segments of the telecast when you get a call that you ought to attend.
  3. Wine glasses at a party can help bring like-minded people together; the same wine glasses can also pre-order condoms for you if need be.

    ... etc., to name a few.
Well, I think you get the picture by now regarding what I meant by "communication + collaboration" of machines. In order for all these things to happen, there have to be established standards and protocols for the devices to communicate and make things happen.

We have a basic version of the hack up and running at http://labs.ashwanthkumar.in/physique/ (we are still working on the API Documentation + SDKs for developers). 

We will come back to you with more updates and features in the upcoming weeks. In the meantime, if you have any ideas, or if you are working on something similar, we would love to connect with you :-)

Sunday, July 22, 2012

Tweetoem - Discovering Art in Tweets

The title is slightly misleading, but I cannot think of a better way to put it. Let me tell you the story behind building "Tweetoem" (Tweets + Poem).

Just in case you stumbled on this post first, Tweetoem is live on http://labs.ashwanthkumar.in/tweetoem/

Tweetoem is highly inspired by 140verses (http://140verses.com/). One of my friends (one of the developers of 140verses) shared the link on his Facebook timeline. It was love at first sight for me, and my reverse-engineering brain cells got activated. I looked into the poems it generates from tweets; I loved the idea, and I loved the execution. It has been quite some time since I did any hacks (none after I started working; the life of an intern seems a bliss now). So, to lift my spirits and brush up on my PHP skills, I wondered if I could put something like that together in less than 6 hours of work. "Tweetoem" was the result.

Algorithm (working behind the scenes)
  1. Get the tweets from Twitter
  2. Strip user_mentions, hashtags, and links from the tweets
  3. Get the last word and reverse it
  4. Calculate the Metaphone value of that word
  5. Store the Tweet + Metaphone value 
I am not sure if this is exactly what 140verses uses (from my observation, it seems to be something more sophisticated), but it does the trick for me. I grabbed a Bootstrap template, wrote a couple of controller methods (Limonade kicks ass here), and that's it. What you see is the outcome of ~6 hours of hacking.
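
To make steps 2 - 5 concrete, here is a rough sketch of the keying logic. The original is PHP; this version is Java using Apache Commons Codec's Metaphone, and every name in it is mine, not Tweetoem's actual code.

import org.apache.commons.codec.language.Metaphone;

public class RhymeKey {
  private static final Metaphone METAPHONE = new Metaphone();

  // Strips mentions, hashtags and links, then keys the tweet by the
  // Metaphone of its reversed last word, so tweets can later be matched
  // into rhyming pairs by comparing these keys.
  public static String keyFor(String tweet) {
    String cleaned = tweet
        .replaceAll("@\\w+", " ")          // user_mentions
        .replaceAll("#\\w+", " ")          // hashtags
        .replaceAll("https?://\\S+", " ")  // links
        .trim();
    if (cleaned.isEmpty()) return "";
    String[] words = cleaned.split("\\s+");
    String last = words[words.length - 1].replaceAll("[^A-Za-z]", "");
    String reversed = new StringBuilder(last).reverse().toString();
    return METAPHONE.metaphone(reversed);
  }

  public static void main(String[] args) {
    System.out.println(keyFor("Singing in the rain #happy"));    // key for "rain"
    System.out.println(keyFor("Numbing all the pain @someone")); // key for "pain"
  }
}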


Known Issues
  1. The system is relatively new, so quite often you might not get a poem
  2. Sometimes lines might be repeated in the poem
  3. Occasionally the search breaks for no reason; a little refresh or a new keyword should do the trick
  4. No share features like in 140verses (intentionally not implemented)


Disclaimer - The Tweetoem idea was inspired by 140verses, and the original developers are to be highly appreciated for the innovative thinking + work. Tweetoem was just a self-ego-satisfying bad hack to replicate the same in less than a day. It by no means tries to compete with 140verses or their scope.

Sunday, July 15, 2012

Regarding Placements -- My 2 cents to fellow final years

Well, since many people ping me on Facebook asking me about Mu Sigma, I thought it might be good to put some real thoughts into writing, so that it might help some folks from my college; but you are free to read it through and share your views.

You should really know something about a company before you even think of attending the interview in the first place. You should have ENGINEER'S PRIDE!

General RULE - You do not want to work in any company!

About the Company

Not all people who get placed in Mu Sigma write code (real code)! Most of them end up working with Excel / PowerPoint (for most of their time). Why? Because Mu Sigma is not an IT company; they solve business problems. That said, Mu Sigma is one of the wonderful places to work. It is a highly data-driven company that works closely with the data and derives conclusions justified by the data. As a typical Business Analyst (the default job offering in my batch), you will work for one of their clients and solve a problem for that client using a well-built and evolving custom framework (DIPP - http://www.mu-sigma.com/knowledge/knowledge-delivery-framework.html). You will get a lot of opportunities to talk to real clients / go on-site (I know a guy who went on-site within 5 months of joining, because he showed some real talent). Almost everyone is young (the average age at the company is ~23 years), but ask any Mu Sigma employee with 1+ years of experience and they will tell you that you really have to work your ass off (not every single person, but most of them). I was an intern there for around 5 months and I worked ~16 hours a day (out of my own interest, though; they still had some interesting work for me!).

Interview Process

Now to say something about the interview process --
  1. First Round - Aptitude (you can answer it if you have your RS Agarwal book problems at your fingertips) -- but since Mu Sigma is a highly data-driven company, give more importance to the data-based problems (the last part of the paper).
  2. Second Round - Group Discussion -- Some random current-affairs topic, on which you need to be bullshitting around for ~5 min, giving chances to everyone and making the on-mute people un-mute themselves too. If you do so, you can get in.
    Points to remember -- Be confident in your points, do not fight (it mostly ends up being a fish market), be polite yet mess with the other person's happiness.
  3. Third Round - Interview -- Mu Sigma generally asks puzzles. Lots of puzzles. They are a data-driven company (basically you need to know math; if you are the kind of person who frets on seeing numbers (like me), Mu Sigma is not the place for you), so you might be hit with a lot of data, and you are also expected to know a lot of data points (about the universe, like the radius of a cricket ball, the average radius of a person's skull / head, etc.) as well. If you are lucky, you might be blessed with the presence of the Stats goddess (trust me, she is not HOT! -- for non-math homo sapiens), who fucks up most people's happiness.
  4. (Optional) Fourth Round -- HR Interview -- This round does not happen for everyone. If they are unsure whether they really want you, you might be called in for this round. Some people get selected or rejected outright at the third round.

These were the 4 rounds I went through, and I was finally made an offer. How I was offered is a completely different story, and it does not happen to everyone. I had my reasons to join Mu Sigma (even though it is not an IT company and I was from CSE).

Background -- I got placed in Mu Sigma on campus, and I did my final-semester internship at Mu Sigma for 5 months. I loved the place, and if given a chance I would really go back and work there, for all practical reasons. I still share some affinity towards Mu Sigma, but I sincerely hope that has not biased my view of the company or the process. Also, this is something I experienced first-hand, and it is highly bound to change for YOU!

YOU HAVE BEEN WARNED!

My 2 Cents 

One thing I realized when I was prepping for placements: it is like standing in the queue to buy tickets for a movie's first show. No one knows what the company is (even if we can go and do some research, we just don't -- we are too busy editing a so-called resume / doing god-knows-what in the name of solving aptitude problems in groups), and does going to such a company really add value to what we have studied? (Studying ECE / EEE for 4 long years and ending up in TCS writing Java / .Net code, which is totally irrelevant -- then why the fuck did you do ECE / EEE in the first place? I want to shout this at every single student of my batch, my seniors and my juniors.)

Since most of us wanted a good college and do not give a shit about what we study during college, there is really no use talking about it now, at the end of 4 years. Still, I would really recommend you go and give it a thought. Do some SWOT analysis (if you can't recognize this term, screw your HR prof., or they must be screwing you; either should happen for sure), find your strengths, and get placed in a company that best suits your interests. Do not join a company just because your parents / siblings might be working there, or because some crackpot neighborhood guy works there and keeps bitching about the company.

At the end of the day, it is your life. Make sure the choices are yours and not anyone else's. All the very best for your placements and a wonderful career. Cheers!!



P.S: This post is free, but it is not guaranteed
P.P.S: Everything written above is first-person experienced / learnt information. I am not liable for any impersonation based on the above-said characteristics by your highly vivid, imaginative mind; they are meant to be fictitious and shall always remain so.
P.P.P.S: If you feel anything is offensive, please leave a comment, but don't SPAM my comment space asking me for advice / starting a flame war. I don't have time to help you with that; ask your HR Dept., they are paid to help you with all this, and most of them do their job well.

Friday, June 29, 2012

[INFO] Apache Wave First Look

It has been so long, so I thought I would give Apache Wave (http://incubator.apache.org/wave/) a try.

Steps to get up and running
$ git clone git://git.apache.org/wave.git wave
$ cd wave
$ ant compile-gwt dist-server   # builds the GWT client and the server distribution
$ ant -f server-config.xml      # generates the server configuration
$ ./run-server.sh               # starts the Wave server

PS -- Watch out: the first ant command will take immense CPU, and some time.



One of the best things is that most of the Google Wave plugins still work :-) #win

Tuesday, May 1, 2012

BlueIgnis - Feature Screencast - Closed Beta Preview

UPDATE on 2nd May, 2012 IST - BlueIgnis Closed Private Beta is closed. Thanks to everyone who tested it.

A year-old dream, hundreds of prototypes, more than 10 months of designing -- finally everything is taking shape. BlueIgnis has finally launched as a Closed Private Beta. Quite a few people are giving it a spin right now. I just love to see the system handle the pressure and scale as needed. The only thing I am hard-pressed on is server resources; though the system is horizontally scalable, I wish I could scale along with it to support more concurrent users and remove the limitations on the system. Just ranting about the status quo.

The good thing is that the system is now reasonably well tested with real users, and it is all set for the demo happening today, about 5 hours from now. Though with Salai's help I have completed some more new features for the system, I am not going to push them until after the demo.

I just made a small 6-minute screencast on the features of BlueIgnis, and I thought I would put it up for you. Though the system is up and running, I would really appreciate it if you do not create more than 1 account per user. The system has a hard limit of 50 users and will lock registrations immediately after that.

Apart from that, I have imposed limits on the number of campaigns per user and on the time period for which a campaign runs after it starts. I will add another screencast once I am done integrating the new set of features.

You can find the screencast on http://youtu.be/VYJGQreoYbg

PS: I totally understand if you feel the screencast is not up to the mark; I have never recorded my own voice before. I will improve on that.

Thursday, April 26, 2012

Realtime Feeds going Crazy

I was doing a screencast for my project and I noticed this *craziness* after moving to the AWS infrastructure. This is a real-time screen recording (no effects / changes made) of my project's Realtime Twitter Dashboard.



Some things I learnt from this video -

  1. The system I built is DAMN CRAZY! - Totally!
  2. My UI did not break even when the data was pouring in like a flash flood - #WIN
  3. Realtime is too fast for a human being to make sense of (the video, at that pace, makes no sense to me at all; it is a show-off!) -- This suggests I should have a rate-control check on the tweets pouring in, or improve the UX like Twitter does with its "You have N new Tweets" message; from what I see, that N will be skyrocketing. Need to do something about it. A rough sketch of such rate control follows below.
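
Here is what that rate control could look like, purely as an illustration (BlueIgnis itself is PHP / jQuery, so every name below is made up):

import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class TweetThrottle {
  private final ConcurrentLinkedQueue<String> buffer = new ConcurrentLinkedQueue<String>();
  private final ScheduledExecutorService ticker = Executors.newSingleThreadScheduledExecutor();

  // Called by the streaming side for every incoming tweet.
  public void onTweet(String tweet) {
    buffer.offer(tweet);
  }

  // Renders at most tweetsPerTick tweets per tick; anything beyond that
  // just feeds the "You have N new Tweets" counter.
  public void start(final int tweetsPerTick, long tickMillis) {
    ticker.scheduleAtFixedRate(new Runnable() {
      public void run() {
        for (int i = 0; i < tweetsPerTick; i++) {
          String tweet = buffer.poll();
          if (tweet == null) break;  // queue drained
          render(tweet);
        }
        int backlog = buffer.size();
        if (backlog > 0) {
          render("You have " + backlog + " new Tweets");
        }
      }
    }, 0, tickMillis, TimeUnit.MILLISECONDS);
  }

  private void render(String line) {
    System.out.println(line);  // stand-in for pushing to the dashboard UI
  }
}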

Saturday, April 14, 2012

BlueIgnis In Making - Pre Alpha Preview

I wonder how many of you have experienced what I have for the past 2 days. I had a phone call from my project coordinator stating that I might not get my degree. There has been a lot of mis-communication by the people involved in the issue. Shit happens!! Now I have my 2nd and 3rd reviews on Monday; I consider myself special, you know ;-)

I never really started building anything for my project until Thursday, when I had the phone calls. I shifted all my engagements at work till Tuesday to get into the damn thing.

Scenario - I had 2 days for building a demo, 1 day for preparing the documentation, and 1 day for travel (which I generally do at night to save time).

To my own astonishment, I was able to come up with something demo-able in 1 day, which I shared with some colleagues of mine to iterate on demand and fix some bugs. After around 1.5 days of work, here are some screenshots from the application.

Ingredients# - PHP (as always) + MySQL + Redis + Twitter Bootstrap + jQuery + Coke + Pressure to get a Degree (*wink*)

# Prioritized by usage

First, the Dashboard. It shows you the overall stats of the Twitter feeds and their sentiment split among Positive, Negative and Neutral. Currently I have implemented only Twitter feeds; Facebook and G+ should not take much time to implement.


Next is Key Influencers: the people who shape the conversation about your product / event / campaign. It is shown in network form, where the center node is a picture that represents your campaign and all other nodes represent users. A user's contribution to the campaign determines the size of their node. One of my friends suggested improving this using Klout scores too. Seems like a very nice idea; I need to see if it is possible, and it would be a kick-ass feature if implemented.

BlueIgnis Key Influencers

Next comes the Realtime Twitter Dashboard. As tweets come into the system, we push them to the screen so that users see what people are saying about their campaign in real time. I do not know UX or anything related to UI, but I am a freaking good user.

I know the problems users might face, and one such problem was this: when I kept adding tweets to the page in real time for a high-frequency campaign, after around 10 - 15 minutes my browser started becoming slow / non-responsive. Too much content on the page was not at all a good idea. Hence we now start with 10 tweets pre-loaded into the page, update it in real time, and keep no more than 100 tweets on the page (on either side) at any given point of time. This keeps the browser and the page more responsive.

I am still open to any further improvements / changes from an expert.


Next comes the Realtime Sentiment Dashboard - a real-time feed of sentiments for the campaign. As we classify tweets as Positive / Negative, they drop into the screen in real time.


Apart from this, we have a login screen and a create-campaign page, which helps you create campaigns. Once I am done with the documentation, I really wish to spend a considerable amount of time on this to add more features and analytics.

Would you be interested in using it? Do you think there are additional features you would be interested in seeing? UX / UI tips for the novice here? Any suggestions are always welcome; drop a comment stating what you feel about it.

Monday, March 19, 2012

Fixing RHIPE socketConnection Error

Today I was working with RHIPE, testing another tool for working with R and Hadoop. When I installed ProtoBuf 2.4.1 and RHIPE 0.66 and tried to run the test sample, I hit a socketConnection problem while the Rhipe library was trying to connect to the server it had started.

After some debugging I found the cause: I did not have a localhost entry in my /etc/hosts, only some custom mappings as per my environment settings. Diving into the R code, I figured out that line 25 points at "localhost", and all I had to do was change it to 127.0.0.1 and re-install the Rhipe library.
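
Alternatively, restoring the standard loopback entry in /etc/hosts should fix it without touching the library (assuming your environment settings permit having it back):

127.0.0.1    localhost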

I forked the project and modified the code at https://github.com/ashwanthkumar/RHIPE. I need to send him a pull request after looking further into the code.

Monday, March 12, 2012

When Psychology meets Web 2.0

The title sounds a bit strange, doesn't it? Yeah! Today I was working on a paper for an international conference, at my HOD's request. He wanted me to modify the BlueIgnis paper and send it, but I was not interested. I wanted to do something from scratch. It is an international conference, and what else do you expect if it's your first time?

I was drawn to "Web-Based Learning: Innovation and Challenges", one of the available topics for the conference. I knew I had to do something in this. I am good at the Web (it is supposed to be my area of expertise; self-trumpet!), but I had never worked on the web-based learning side of the web. Almost everything I know was learnt by searching through Google, but that does not constitute web-based learning. I have never really gone through a full web-based course; I need to do that sometime soon.

On the other hand, I was desperate to do something on this topic, and I finally remembered Nivi and her PhD research topic, which has its roots in contextual learning in smart classrooms. How on earth do you bring contextual learning to a classroom in India? That's what she does; you must really talk to her and you will know what I mean :-)

After talking to her for around 20 minutes, I had everything I wanted for the paper. She introduced me to Lev Vygotsky and Howard Gardner. Her work is based on their work on Social Development Theory and Multiple Intelligences. She is applying them in a real-world classroom, while all I needed to do was try to apply them in a virtual classroom :-)

The idea: bring in technologies that tie these theories to the Web -- marry them with today's social media burst (Vygotsky's theory) and with how personalized the content on the web has become (Gardner's theory mapped to the web as it is now). All we need is a good POC to give it a go.

I did some googling and figured out there has been very little activity on these topics after 2001. I personally feel services like Facebook, Twitter, GitHub, etc. can definitely be tuned to provide both a social and a personalized learning experience to their respective target audiences.

Seems like it's not only news, e-commerce, or search that can be personalized after all. Welcome to the new era of #Personalization in (almost) everything.

Thursday, March 8, 2012

List of Hadoop Ecosystem Tools

Some time back there was a discussion on the Hadoop User mail list for the list of Hadoop ecosystem tools. I just thought I can put them together with a short description and links to their git repos or products page. If you find an error or feel I have missed out something let me know, I will update it.

Tools are in ascending order of their names.
  1. Ambari - Ambari is a monitoring, administration and lifecycle management project for Apache Hadoop™ clusters.
    Hadoop clusters require many inter-related components that must be installed, configured, and managed across the entire cluster. The set of components that are currently supported by Ambari includes: HBase, HCatalog, Hadoop HDFS, Hive, Hadoop MapReduce, Pig, Zookeeper. Visit their website for more information.
  2. Avro - Apache Avro is a data serialization system
    Avro provides:
    1. Rich data structures,
    2. A compact, fast, binary data format,
    3. A container file, to store persistent data,
    4. Remote procedure call (RPC),
    5. Simple integration with dynamic languages.
      In terms of how it works, it is similar to tools like Thrift or Protobuf, but it has its own edge, as described in their documentation. To put it short, it can be used for providing API-like services that leverage the Hadoop stack for performing some task.
  3. Bixo - Bixo is an open source web mining toolkit that runs as a series of Cascading pipes on top of Hadoop
    By building a customized Cascading pipe assembly, you can quickly create specialized web mining applications that are optimized for a particular use case. More information on their website.
  4. BookKeeper - BookKeeper is a system to reliably log streams of records
    It is designed to store write ahead logs, such as those found in database or database like applications. In fact, the Hadoop NameNode inspired BookKeeper. The NameNode logs changes to the in-memory namespace data structures to the local disk before they are applied in memory. However logging the changes locally means that if the NameNode fails the log will be inaccessible. We found that by using BookKeeper, the NameNode can log to distributed storage devices in a way that yields higher availability and performance. Although it was designed for the NameNode, BookKeeper can be used for any application that needs strong durability guarantees with high performance and has a single writer. More Info on their website.
  5. Cascading - Cascading is a Data Processing API, Process Planner, and Process Scheduler used for defining and executing complex, scale-free, and fault tolerant data processing workflows on an Apache Hadoop cluster. All without having to 'think' in MapReduce.
    Cascading is a thin Java library and API that sits on top of Hadoop's MapReduce layer and is executed from the command line like any other Hadoop application. Well more detailed documentation can be found on their website.
  6. Cascalog - Cascalog is a Clojure-based query language for Hadoop inspired by Datalog
    Cascalog is a fully-featured data processing and querying library for Clojure. The main use cases for Cascalog are processing "Big Data" on top of Hadoop or doing analysis on your local computer from the Clojure REPL. Cascalog is a replacement for tools like Pig, Hive, and Cascading.
    Cascalog operates at a significantly higher level of abstraction than a tool like SQL. More importantly, its tight integration with Clojure gives you the power to use abstraction and composition techniques with your data processing code just like you would with any other code. It's this latter point that sets Cascalog far above any other tool in terms of expressive power. General introduction here, and source code here.
  7. Chukwa - Chukwa is an open source data collection system for monitoring large distributed systems
    Chukwa is built on top of the Hadoop Distributed File System (HDFS) and Map/Reduce framework and inherits Hadoop's scalability and robustness. Chukwa also includes a flexible and powerful toolkit for displaying, monitoring and analyzing results to make the best use of the collected data. More information can be found here.
  8. Crunch - a Java library that aims to make writing, testing, and running MapReduce pipelines easy, efficient, and even fun. Crunch’s design is modeled after Google’s FlumeJava
    Crunch is a Java library for writing, testing, and running MapReduce pipelines, based on Google's FlumeJava. Its goal is to make pipelines that are composed of many user-defined functions simple to write, easy to test, and efficient to run. Excellent introduction here and Source code is here.
  9. Crux - Reporting tool built for HBase
    Crux is a reporting application for HBase, the Hadoop database. Crux helps to query and visualize data saved in HBase. General introduction can be found here and source code is here.
  10. Elastic Map Reduce - web service that enables businesses, researchers, data analysts, and developers to easily and cost-effectively process vast amounts of data
    Amazon Elastic MapReduce utilizes a hosted Hadoop framework running on the web-scale infrastructure of Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Simple Storage Service (Amazon S3). More information can be found on the AWS EMR page.
  11. Flume - distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of log data
    Its main goal is to deliver data from applications to Hadoop’s HDFS. It has a simple and flexible architecture based on streaming data flows. It is robust and fault tolerant with tunable reliability mechanisms and many failover and recovery mechanisms. The system is centrally managed and allows for intelligent dynamic management. It uses a simple extensible data model that allows for online analytic applications. More information can be found on their wiki and source code is here.
  12. Hadoop common - Hadoop Common is a set of utilities that support the Hadoop subprojects
    Hadoop Common is a set of utilities that support the Hadoop subprojects. Hadoop Common includes FileSystem, RPC, and serialization libraries. More info can be gathered here.
  13. Hama - distributed computing framework based on BSP (Bulk Synchronous Parallel) computing techniques for massive scientific computations
    It was inspired by Google's Pregel, but differs in that it is a pure BSP, general-purpose model, not just for graphs. More information can be found here.
  14. HBase - distributed scalable Big Data store
    HBase is the Hadoop database. Think of it as a distributed scalable Big Data store. HBase can be used when you need random, realtime read/write access to your Big Data. There is extensive resource on HBase Book.
  15. HCatalog - table and storage management service for data created using Apache Hadoop
    Apache HCatalog provides a shared schema and data-type mechanism, a table abstraction so that users need not be concerned with where or how their data is stored, and interoperability across data processing tools such as Pig, MapReduce, Streaming, and Hive. More information can be found on their website.
  16. HDFS - primary storage system used by Hadoop applications
    The Hadoop Distributed File System (HDFS) is a distributed file system designed to run on commodity hardware. It has many similarities with existing distributed file systems. However, the differences from other distributed file systems are significant. HDFS is highly fault-tolerant and is designed to be deployed on low-cost hardware. HDFS provides high throughput access to application data and is suitable for applications that have large data sets. HDFS relaxes a few POSIX requirements to enable streaming access to file system data. HDFS was originally built as infrastructure for the Apache Nutch web search engine project. More information can be found on their website.
  17. HIHO: Hadoop In, Hadoop Out - Hadoop Data Integration, deduplication, incremental update and more
    Hadoop Data Integration with various databases, ftp servers, salesforce. Incremental update, dedup, append, merge your data on Hadoop. More information can be found on their website.
  18. Hive - data warehouse system for Hadoop that facilitates easy data summarization, ad-hoc queries, and the analysis of large datasets stored in Hadoop compatible file systems
    Hive provides a mechanism to project structure onto this data and query the data using a SQL-like language called HiveQL. At the same time this language also allows traditional map/reduce programmers to plug in their custom mappers and reducers when it is inconvenient or inefficient to express this logic in HiveQL. More information can be found on their website.
  19. Hoop - provides access to all Hadoop Distributed File System (HDFS) operations (read and write) over HTTP/S
    Hoop server is a full rewrite of Hadoop HDFS Proxy. Although it is similar to Hadoop HDFS Proxy (runs in a servlet-container, provides a REST API, pluggable authentication and authorization), Hoop server improves many of Hadoop HDFS Proxy shortcomings. More information can be found on their website.
  20. HUE (Hadoop User Environment) - browser-based desktop interface for interacting with Hadoop
    Hue is both a web UI for Hadoop and a framework to create interactive web applications. It features a FileBrowser for accessing HDFS, JobSub and JobBrowser applications for submitting and viewing MapReduce jobs, a Beeswax application for interacting with Hive. On top of that, the web frontend is mostly built from declarative widgets that require no JavaScript and are easy to learn. More information can be found on their git repo.
  21. Jaql - Query Language for JavaScript(r) Object Notation (JSON)
    Jaql is a query language designed for Javascript Object Notation (JSON), a data format that has become popular because of its simplicity and modeling flexibility. Jaql is primarily used to analyze large-scale semi-structured data. Core features include user extensibility and parallelism. In addition to modeling semi-structured data, JSON simplifies extensibility. Hadoop's Map-Reduce is used for parallelism. More information can be found on their code base.
  22. Lily - Lily is Smart Data, at Scale, made Easy
    It is the first data repository built from the ground up to bring Big Data / NOSQL technology into the hands of the enterprise application architect. More detailed information on their website.
  23. Mahout - a machine learning library whose goal is to build scalable machine learning libraries
    Scalable to reasonably large data sets. Our core algorithms for clustering, classification and batch-based collaborative filtering are implemented on top of Apache Hadoop using the map/reduce paradigm. However, we do not restrict contributions to Hadoop-based implementations: contributions that run on a single node or on a non-Hadoop cluster are welcome as well. The core libraries are highly optimized for good performance, also for non-distributed algorithms. More information can be found here.
  24. Map Reduce - MapReduce is a programming model and software framework for writing applications that rapidly process vast amounts of data in parallel on large clusters of compute nodes
    A MapReduce job usually splits the input data-set into independent chunks which are processed by the map tasks in a completely parallel manner. The framework sorts the outputs of the maps, which are then input to the reduce tasks. Typically both the input and the output of the job are stored in a file-system. The framework takes care of scheduling tasks, monitoring them and re-executing the failed tasks (a canonical word-count sketch follows right after this list). A more detailed tutorial, with examples to run on top of Hadoop, can be found here, while a general introduction to MapReduce can be found in Google University.
  25. Nutch - open source Web crawler written in Java
    You have all the features you can expect from a web crawler. Now Nutch can be integrated with Hadoop, this resource can help you setting that up. More details of Nutch can be found on their website.
  26. Oozie - workflow/coordination service to manage data processing jobs for Apache Hadoop
    Oozie is an extensible, scalable and data-aware service to orchestrate dependencies between jobs running on Hadoop (including HDFS, Pig and MapReduce). Oozie is a lot of things, but being a workflow solution for off-Hadoop processing, or another query processing API a la Cascading, is not one of them. More useful information can be found here and also here.
  27. Pangool - low-level MapReduce API that aims to be a replacement for the Hadoop Java MR API
    By implementing an intermediate tuple-based schema and configuring a Job conveniently, many of the accidental complexities that arise from using the Hadoop Java MapReduce API disappear.
  28. Pig - platform for analyzing large data sets that consists of a high-level language for expressing data analysis programs
    The salient property of Pig programs is that their structure is amenable to substantial parallelization, which in turns enables them to handle very large data sets. At the present time, Pig's infrastructure layer consists of a compiler that produces sequences of Map-Reduce programs, for which large-scale parallel implementations already exist (e.g., the Hadoop subproject). More information can be found at Pig Home.
  29. PrestoDB - Presto is an open source distributed SQL query engine for running interactive analytic queries against data sources of all sizes ranging from gigabytes to petabytes. 
  30. Scalding - Scala API for Cascading
    Refer to Cascading for more information. More information on Scalding can be found here. An excellent tutorial can be found here for a hands-on introduction to Scalding.
  31. Sqoop - tool designed for efficiently transferring bulk data between Apache Hadoop and structured datastores such as relational databases
    Wonderful documentation on Sqoop can be found on Cloudera, its creator. Also official website is here.
  32. Zookeeper - centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services
    ZooKeeper is a centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services. All of these kinds of services are used in some form or another by distributed applications. Each time they are implemented there is a lot of work that goes into fixing the bugs and race conditions that are inevitable. Because of the difficulty of implementing these kinds of services, applications initially usually skimp on them, which makes them brittle in the presence of change and difficult to manage. Even when done correctly, different implementations of these services lead to management complexity when the applications are deployed. More information can be found here.
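
As promised under the MapReduce entry, here is the canonical word-count job for the model; this is essentially the stock Hadoop tutorial example, lightly commented.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);  // emit (word, 1) for every token
      }
    }
  }

  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) sum += v.get();  // the framework groups by key
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = new Job(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}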

Using AWS from Corporate Firewall

I know the title is pretty straightforward. I just wanted to share one of my learnings when I tried to use AWS services from within a corporate firewall.

Rule of thumb - Never add all the IPs that Amazon releases on its forum to the ruleset of your firewall.

Amazon Web Services (AWS) has a concept called "Elastic IPs". This allows you to get a set of static IPs and then use them with EC2 instances or the VPC service. Create a bunch of IPs, around 10 - 15 (depending on your purpose), and assign them manually to the instances as and when you create them.

Elastic IPs are not free, but the cost of holding them even when you don't use them is very low. So it's worth the bargain, especially if you belong to an organization that harps on data security.

PS: I am in no way related to Amazon, but I just love their services :-)

Friday, March 2, 2012

Patching Hadoop to support RMR 1.2

At work we were working on R and Hadoop using RHadoop (RMR). The latest release, RMR v1.2 (download), has quite a few interesting updates. See here for a complete overview.

One of our test Hadoop clusters has 10+ nodes and runs the Hadoop 0.20-append version, built specifically for HBase. When we upgraded the RMR package on the cluster to v1.2, we ran into multiple issues. This post is a summary of my experience patching Hadoop 0.20.x versions to support RMR v1.2 right away, hoping it might be helpful for others in the community who encounter the same problems.


  1. First and foremost: if you upgrade the RMR version without patching your earlier Hadoop distribution, you are likely to end up with an error about org.apache.hadoop.contrib.streaming.AutoInputFormat.
  2. Follow the instructions in Dumbo's wiki (https://github.com/klbostee/dumbo/wiki/Building-and-installing) to download and apply the required patches.
  3. This should let you run your latest R code on the Hadoop cluster. You still can't use the "combine" parameter if you are on a version earlier than hadoop-0.20.203, in which case you also need the HADOOP-4842 patch.
These are, in general, the broad steps involved in building the patched version of Hadoop.

When you manage a large cluster (I am not talking about me) of over 40 - 50 machines, building on all the systems is a waste of time, and you generally need to bring your Hadoop stack down. So what I suggest is: download a local copy of the Hadoop version you are running right now.

You can either download it from the Hadoop releases or check it out from source, assuming you don't already have a custom-built version of Hadoop.

Apply the patches to the local copy (you don't need to edit any configuration or change any parameters). Just apply the patches and build the Hadoop source. Once you do that, you need to replace only one single JAR file in your production cluster: $HADOOP_HOME/contrib/streaming/hadoop*streaming*.jar. All the patches deal only with Hadoop Streaming. But do realize you need to build the JAR for your version.

I just wrote a small ant build script that automates the above process.


PS: Though I have tested the code, still try this at your own risk. 

Wednesday, February 29, 2012

Twitter Streaming Limit Workaround


I was working on my final year project (BlueIgnis), which uses the Streaming Twitter API. I had the following understandings (on the free version):
  1. One account can open only one streaming connection at any given time.
  2. One IP may be associated with only one account while streaming. Rotating streaming connections across multiple accounts is not allowed and may lead to an IP ban. (All the more reason to use EC2 instances for streaming :P)
  3. One streaming connection may allow up to 400 tracks (different keywords) to filter on.
  4. Respond to 402 error codes with the proper HTTP status handling.
  5. Use non-aggressive re-connect policies; give a substantial amount of time between subsequent requests.
  6. Periodically stop the streaming connection, add more tracks (keywords) to the list, and re-start the connection, rather than opening individual connections multiple times.
Based on these understandings, I came up with my own architecture for Twitter streaming. The diagram below represents the overall architecture of my application with respect to the Twitter Streaming component.



Hosting the Twitter streaming on an EC2 instance, we can achieve 400 tracks (keywords) per node, which can handle approx. 30 - 50 customers in my use-case. I periodically (~10 min) check if there are any new tracks that need to be added to the node, until it reaches 400. I also need to know which user requested each track, which is not possible to get from the way the Streaming API currently works (a rough sketch of the track rotation is below).
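
Here is a rough sketch of that per-node track rotation, written against Twitter4J for illustration (the actual system is PHP, and all class / method names below are mine):

import java.util.Set;
import java.util.concurrent.ConcurrentSkipListSet;

import twitter4j.FilterQuery;
import twitter4j.Status;
import twitter4j.StatusAdapter;
import twitter4j.TwitterStream;
import twitter4j.TwitterStreamFactory;

public class TrackRotator {
  private static final int MAX_TRACKS = 400;  // limit per streaming connection

  private final Set<String> tracks = new ConcurrentSkipListSet<String>();
  private final TwitterStream stream = new TwitterStreamFactory().getInstance();

  public TrackRotator() {
    stream.addListener(new StatusAdapter() {
      @Override
      public void onStatus(Status status) {
        store(status);  // stack everything into the local "firehose"
      }
    });
  }

  // Returns false when this node is full and a new node should be spun up.
  public synchronized boolean addTrack(String keyword) {
    return tracks.size() < MAX_TRACKS && tracks.add(keyword);
  }

  // Called every ~10 minutes: restart the single connection with the
  // updated track list instead of opening one connection per track.
  public synchronized void refresh() {
    stream.cleanUp();  // shut down the current connection
    stream.filter(new FilterQuery().track(tracks.toArray(new String[0])));
  }

  private void store(Status status) {
    // write the tweet into the MySQL firehose table for FULLTEXT matching
  }
}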

So I decided to build a local firehose, where I stack all the tweets for all the tracks in a single location. Then I use the FULLTEXT search feature of MySQL (my datastore) to continuously search for the related tweets, so I get a slightly delayed yet close-to-realtime feel of streaming.

If you have any better ways to get things done, please let me know.

Saturday, February 25, 2012

Hadoop AutoKill Hack

February's hack! As you might already know, I am working on Big Data, and de facto we all use the Hadoop ecosystem to get things done. On the same note, I was looking into the Hadoop Java API the other day to see how much of what happens under the hood I could get at.

More specifically, I was trying to use the JobClient class to see if I could build a custom client or an interface to the Hadoop jobs we run on our cluster, during which I thought: can I add a custom job-timeout feature to Hadoop?

Problem Statement: I want to kill any job that runs beyond T time units in Hadoop. How do I do it?

So I started writing a custom client that talks to the JobTracker to get the list of running jobs and how long they have been running; if they exceed my threshold time limit, I kill them. That is the overall concept, and I guess that's what I built. The API is so simple and straightforward that all it took was less than an hour of looking into jobdetails.jsp to see how to access the jobs from the JobTracker and read the start time.
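
The core of it boils down to something like this -- a stripped-down sketch of the concept, not the exact code in the repo:

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.JobStatus;
import org.apache.hadoop.mapred.RunningJob;

public class AutoKill {
  public static void main(String[] args) throws Exception {
    long threshold = Long.parseLong(args[0]) * 60 * 1000;  // minutes -> millis
    JobClient client = new JobClient(new JobConf());  // picks up the cluster config
    for (JobStatus status : client.jobsToComplete()) {  // running / pending jobs
      long age = System.currentTimeMillis() - status.getStartTime();
      if (age > threshold) {
        RunningJob job = client.getJob(status.getJobID());
        if (job != null) {
          System.out.println("Killing " + status.getJobID() + " after " + age + " ms");
          job.killJob();
        }
      }
    }
  }
}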

However, the tricky thing was how to run the damn thing. I always got an "IOException: Broken Pipe" error. I finally figured out that the way to run it is:

$ hadoop jar JARName.jar

So yeah, I wrote a small hack for this. You can find it on my Git (https://github.com/ashwanthkumar/hadoop-autokill).

Thursday, February 9, 2012

Introducing Scraphp - Web Crawler in PHP


Scraphp (say "Scraph", the last p is silent) is a web crawling program. It is built to be a standalone executable that can crawl websites and extract useful content out of them. I created this script for a challenge posted by Indix in Jan 2012, wherein I was asked to crawl AGMarket (http://agmarknet.nic.in/) to get the prices of all the products and store them. I also had to version the prices so that they persist across dates.
Scraphp was inspired by a similar project called Scrapy, written in Python. This is not an attempt to port it; I just wanted to see how many of its properties I could build in less than a day.
One of the major features I would like to call out: when you crawl a page, you can extract entities out of it based on XPath. Basically, when we crawl a page, I create a bean whose properties are the values obtained by applying the given XPaths to the page. Each XPath is completely independent of the others. Currently Scraphp supports creating only 1 type of object per page.
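
To illustrate the model in code: each configured XPath independently fills one property of the page's bean. This is a hypothetical Java rendering (Scraphp itself is PHP), and it assumes well-formed XHTML, since javax.xml's DOM parser is not a forgiving HTML parser:

import java.util.HashMap;
import java.util.Map;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;

public class PageExtractor {
  // Applies each configured XPath to the page and collects the results
  // into one "bean" (a map of property name -> extracted value).
  public static Map<String, String> extract(String url, Map<String, String> xpaths)
      throws Exception {
    Document doc = DocumentBuilderFactory.newInstance()
        .newDocumentBuilder()
        .parse(url);
    XPath xpath = XPathFactory.newInstance().newXPath();
    Map<String, String> bean = new HashMap<String, String>();
    for (Map.Entry<String, String> entry : xpaths.entrySet()) {
      // each XPath is evaluated independently of the others
      bean.put(entry.getKey(), xpath.evaluate(entry.getValue(), doc));
    }
    return bean;
  }

  public static void main(String[] args) throws Exception {
    Map<String, String> config = new HashMap<String, String>();
    config.put("product", "//td[@class='product']/text()");  // illustrative XPaths
    config.put("price", "//td[@class='price']/text()");
    System.out.println(extract("http://example.com/prices.xhtml", config));
  }
}
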
Hack into the source code; it's well commented and easy to modify as per your requirements. All the details of the page to crawl and the XPath queries are provided in configuration.php, or you can supply your own config file; see the Usage.
Code is available on my Git Repo - https://github.com/ashwanthkumar/scraphp 
I have tried my best to document the entire code well. If you feel any improvements can be made, or you have any suggestions, please do not hesitate to fork and send me a pull request.

Wednesday, February 8, 2012

Consuming CommonSense Knowledge on MR

ConceptNet5 is one of my all-time favorite datasets out there. I am working with it in more detail for BlueIgnis (more details on this later). Since getting into Big Data at Mu Sigma, this thought has been lingering in my mind: CN5 is a really large dataset, ~24 GB of exported JSON that goes up to ~111 GB with indices (as explained at the link), so why not use MapReduce to spice things up a bit?

When I joined the company last month, I was told to start with R (the statistical language). I had always wanted to port Divisi2 to Java or PHP so that I could hack into it more. After a day of getting to know R, I wrote a simple wrapper in R to build a CommonSense matrix. Not the entire thing, just a sample of it with made-up data, and I made it work (R code here).

Well, basically it's all about doing SVD and operating on its components, U, V and E (Sigma), to make predictions. Blah.. blah.. you can read the page in detail if you understand the math (unlike me).
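
For reference, the decomposition being talked about, in its textbook form (this is standard SVD notation, not anything Divisi2-specific):

A = U \Sigma V^{T}, \qquad \hat{A}_k = U_k \Sigma_k V_k^{T}

Keeping only the top k singular values in \Sigma_k gives the low-rank approximation \hat{A}_k, and it is this smoothed matrix that the predictions are read off of.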

What am I trying to do here?
What I was wondering is this: the number of concepts (nodes) in CN5 exceeds way more than what I can imagine (I am yet to count them, as I still have a 32-bit system with me, and MongoDB can't hold more than 2GB of data on Win32 systems. Sigh!), and the same goes for the relations of each concept, which form the columns of the matrix. If only I could transfer the data from Mongo to HBase and use Mahout's SVD implementation to build the required matrices and store them in HBase (again), I guess that should let me do commonsense-dataset-based processing of data. I need to process realtime tweets and FB posts in Storm for BlueIgnis; would it match the performance on a real-time basis? Is this even possible? I don't have answers to these and many related questions yet. Just an idea; yet to hack into it more.

Let me know if you have already implemented this or are working along a similar road.

Updates:
Some interesting thoughts on GraphLab usage and its performance over Mahout's implementation, here (see the comments).

PS: The above idea was thought up over a cup of tea and some cake, with no work at hand. If you have already got anything like this, I would love to hear from you.

I am Alive!

It has been ages since I wrote a post here. A lot of interesting things have been going on in my life. I will try to keep this space updated in the forthcoming weeks.

I joined Mu Sigma as an intern last month and got into their Innovation and Development (generally R&D) department, working with Big Data. Work so far has been excellent: playing with clusters, learning new languages, and some boring math, but it's all part of the game, isn't it?

PS: This is just an update post, to denote I am still alive and not dead. 

Friday, January 6, 2012

Review on Poi Solla Theriyathu - Tamil Short Film

The title should have told you all about it. No technical stuff this time. This post is a review of a short film made by some of my classmates.

Please watch the movie before you read on.



Review of Poi Solla Theriyathu


Poi Solla Theriyathu is a nice tea-break film you can watch to laugh your ass off. A very professional and nice attempt by Parvath, after his success as an actor in 11:fifty. There are really no cons in the movie at all (except some silly ones).

The only place where Parvath slipped was the focusing of the camera, especially while introducing Swati and during the chase of Mahesh.

Apart from that, I was not able to find anything wrong with the execution of the movie. The final part, which recaps some moments from the shooting, was a superb addition to the movie.

You can follow Terror Boys Film Crew on Facebook.

Overall rating of the film: 8/10

"Ennaku Poi Solla Theriyathunga"

Monday, January 2, 2012

NSE Valid Symbols

List of NSE symbols, for reference, for the NSE Live Stock API (http://live-nse.herokuapp.com/?symbol=INFY)
  1. 20MICRONS
  2. 3IINFOTECH
  3. 3MINDIA
  4. A2ZMES
  5. AANJANEYA
  6. AARTIDRUGS
  7. AARTIIND
  8. AARVEEDEN
  9. ABAN
  10. ABB
  11. ABBOTINDIA
  12. ABCIL
  13. ABGSHIP
  14. ABHISHEK
  15. ABIRLANUVO
  16. ACC
  17. ACE
  18. ACROPETAL
  19. ADANIENT
  20. ADANIPOWER
  21. ADFFOODS
  22. ADHUNIK
  23. ADORWELD
  24. ADSL
  25. ADVANIHOTR
  26. ADVANTA
  27. AEGISCHEM
  28. AFL
  29. AFTEK
  30. AGCNET
  31. AGRE
  32. AGRODUTCH
  33. AHLEAST
  34. AHLUCONT
  35. AHLWEST
  36. AHMEDFORGE
  37. AIAENG
  38. AJANTPHARM
  39. AJMERA
  40. AKSHOPTFBR
  41. AKZOINDIA
  42. ALBK
  43. ALCHEM
  44. ALEMBICLTD
  45. ALFALAVAL
  46. ALICON
  47. ALKALI
  48. ALKYLAMINE
  49. ALLCARGO
  50. ALLSEC
  51. ALMONDZ
  52. ALOKTEXT
  53. ALPA
  54. ALPHAGEO
  55. ALPINEHOU
  56. ALPSINDUS
  57. AMAR
  58. AMARAJABAT
  59. AMARJOTHI
  60. AMBIKCO
  61. AMBUJACEM
  62. AMDIND
  63. AMLSTEEL
  64. AMRUTANJAN
  65. AMTEKAUTO
  66. AMTEKINDIA
  67. ANANTRAJ
  68. ANDHRABANK
  69. ANDHRSUGAR
  70. ANGIND
  71. ANIKINDS
  72. ANKURDRUGS
  73. ANSALAPI
  74. ANSALHSG
  75. ANTGRAPHIC
  76. APARINDS
  77. APCOTEXIND
  78. APIL
  79. APLAB
  80. APLAPOLLO
  81. APLLTD
  82. APOLLOHOSP
  83. APOLLOTYRE
  84. APPAPER
  85. APTECHT
  86. AQUA
  87. ARCHIES
  88. AREVAT&D
  89. ARIES
  90. ARIHANT
  91. ARL
  92. AROGRANITE
  93. ARROWTEX
  94. ARSHIYA
  95. ARSSINFRA
  96. ARVIND
  97. ASAHIINDIA
  98. ASAHISONG
  99. ASAL
  100. ASHAPURMIN
  101. ASHIANA
  102. ASHIMASYN
  103. ASHOKA
  104. ASHOKLEY
  105. ASIANELEC
  106. ASIANHOTNR
  107. ASIANPAINT
  108. ASIANTILES
  109. ASIL
  110. ASSAMCO
  111. ASTEC
  112. ASTERSILI
  113. ASTRAL
  114. ASTRAMICRO
  115. ASTRAZEN
  116. ATFL
  117. ATLANTA
  118. ATLASCYCLE
  119. ATNINTER
  120. ATUL
  121. AURIONPRO
  122. AUROPHARMA
  123. AUSOMENT
  124. AUSTRAL
  125. AUTOAXLES
  126. AUTOIND
  127. AUTOLITIND
  128. AVENTIS
  129. AVTNPL
  130. AXIS-IT&T
  131. AXISBANK
  132. BAGFILMS
  133. BAJAJ-AUTO
  134. BAJAJCORP
  135. BAJAJELEC
  136. BAJAJFINSV
  137. BAJAJHIND
  138. BAJAJHLDNG
  139. BAJFINANCE
  140. BALAJITELE
  141. BALAMINES
  142. BALKRISIND
  143. BALLARPUR
  144. BALMLAWRIE
  145. BALPHARMA
  146. BALRAMCHIN
  147. BANARBEADS
  148. BANARISUG
  149. BANCOINDIA
  150. BANG
  151. BANKBARODA
  152. BANKINDIA
  153. BANSWRAS
  154. BARTRONICS
  155. BASF
  156. BASML
  157. BATAINDIA
  158. BATLIBOI
  159. BAYERCROP
  160. BBL
  161. BBTC
  162. BEARDSELL
  163. BECREL
  164. BEDMUTHA
  165. BEL
  166. BELLCERATL
  167. BEML
  168. BEPL
  169. BERGEPAINT
  170. BFINVEST
  171. BFUTILITIE
  172. BGLOBAL
  173. BGRENERGY
  174. BHAGWATIHO
  175. BHAGYNAGAR
  176. BHARATFORG
  177. BHARATGEAR
  178. BHARATRAS
  179. BHARTIARTL
  180. BHARTISHIP
  181. BHEL
  182. BHUSANSTL
  183. BIL
  184. BILENERGY
  185. BILPOWER
  186. BIMETAL
  187. BINANIIND
  188. BINDALAGRO
  189. BIOCON
  190. BIRLACORPN
  191. BIRLACOT
  192. BIRLAERIC
  193. BIRLAMONEY
  194. BIRLAPOWER
  195. BLBLIMITED
  196. BLISSGVS
  197. BLKASHYAP
  198. BLUECHIP
  199. BLUECOAST
  200. BLUEDART
  201. BLUESTARCO
  202. BLUESTINFO
  203. BOC
  204. BODALCHEM
  205. BOMDYEING
  206. BOSCHLTD
  207. BPCL
  208. BPL
  209. BRANDHOUSE
  210. BRFL
  211. BRIGADE
  212. BRITANNIA
  213. BROADCAST
  214. BROOKS
  215. BSELINFRA
  216. BSL
  217. BSTRANSCOM
  218. BURNPUR
  219. BVCL
  220. CADILAHC
  221. CAIRN
  222. CALSOFT
  223. CAMBRIDGE
  224. CAMLIN
  225. CANBK
  226. CANDC
  227. CANFINHOME
  228. CANTABIL
  229. CARBORUNIV
  230. CAREERP
  231. CAROLINFO
  232. CASTROL
  233. CCCL
  234. CCL
  235. CEATLTD
  236. CEBBCO
  237. CELEBRITY
  238. CELESTIAL
  239. CENTENKA
  240. CENTEXT
  241. CENTRALBK
  242. CENTUM
  243. CENTURYPLY
  244. CENTURYTEX
  245. CERA
  246. CESC
  247. CHAMBLFERT
  248. CHEMFALKAL
  249. CHEMPLAST
  250. CHENNPETRO
  251. CHESLINTEX
  252. CHETTINAD
  253. CHOLAFIN
  254. CILNOVA
  255. CIMMCO
  256. CINEMAX
  257. CINEVISTA
  258. CIPLA
  259. CLASSIC
  260. CLNINDIA
  261. CLUTCHAUTO
  262. CMAHENDRA
  263. CMC
  264. COALINDIA
  265. COLPAL
  266. COMPUAGE
  267. CONCOR
  268. CONSOFINVT
  269. CORAL-HUB
  270. CORDSCABLE
  271. COREEDUTEC
  272. COROENGG
  273. COROMANDEL
  274. CORPBANK
  275. COSMOFILMS
  276. COX&KINGS
  277. CREATIVEYE
  278. CRESTANI
  279. CREWBOS
  280. CRISIL
  281. CROMPGREAV
  282. CRONIMET
  283. CTE
  284. CUB
  285. CUBEXTUB
  286. CUMMINSIND
  287. CURATECH
  288. CYBERMEDIA
  289. CYBERTECH
  290. DAAWAT
  291. DABUR
  292. DALMIABEL
  293. DALMIASUG
  294. DATAMATICS
  295. DBCORP
  296. DBREALTY
  297. DCB
  298. DCHL
  299. DCM
  300. DCMSRMCONS
  301. DCW
  302. DECCANCE
  303. DEEPAKFERT
  304. DEEPAKNTR
  305. DEEPIND
  306. DELTACORP
  307. DEN
  308. DENABANK
  309. DENORA
  310. DENSO
  311. DEWANHOUS
  312. DHAMPURSUG
  313. DHANBANK
  314. DHANUKA
  315. DHARSUGAR
  316. DHUNINV
  317. DIAPOWER
  318. DICIND
  319. DIGJAM
  320. DISHMAN
  321. DISHTV
  322. DIVISLAB
  323. DLF
  324. DLINKINDIA
  325. DOLPHINOFF
  326. DONEAR
  327. DPSCLTD
  328. DPTL
  329. DQE
  330. DREDGECORP
  331. DRREDDY
  332. DSKULKARNI
  333. DSSL
  334. DUNCANSIND
  335. DWARKESH
  336. DYNAMATECH
  337. EASTSILK
  338. EASUNREYRL
  339. ECEIND
  340. ECLERX
  341. EDELWEISS
  342. EDL
  343. EDSERV
  344. EDUCOMP
  345. EICHERMOT
  346. EIDPARRY
  347. EIHAHOTELS
  348. EIHOTEL
  349. EIMCOELECO
  350. EKC
  351. ELDERPHARM
  352. ELECON
  353. ELECTCAST
  354. ELECTHERM
  355. ELFORGE
  356. ELGIEQUIP
  357. ELGIRUBCO
  358. ELNET
  359. EMAMIINFRA
  360. EMAMILTD
  361. EMCO
  362. EMKAY
  363. EMMBI
  364. EMPEESUG
  365. ENERGYDEV
  366. ENGINERSIN
  367. ENIL
  368. ENTEGRA
  369. EONELECT
  370. ERAINFRA
  371. EROSMEDIA
  372. ESABINDIA
  373. ESCORTS
  374. ESL
  375. ESSAROIL
  376. ESSARPORTS
  377. ESSARSHPNG
  378. ESSDEE
  379. ESSELPACK
  380. ESTER
  381. EUROCERA
  382. EUROMULTI
  383. EUROTEXIND
  384. EVEREADY
  385. EVERESTIND
  386. EVERONN
  387. EXCELCROP
  388. EXCELINDUS
  389. EXCELINFO
  390. EXIDEIND
  391. FACT
  392. FAGBEARING
  393. FAME
  394. FARMAXIND
  395. FCH
  396. FCSSOFT
  397. FDC
  398. FEDDERLOYD
  399. FEDERALBNK
  400. FIEMIND
  401. FILATEX
  402. FINANTECH
  403. FINCABLES
  404. FINPIPE
  405. FIRSTLEASE
  406. FIRSTWIN
  407. FKONCO
  408. FLEXITUFF
  409. FMGOETZE
  410. FORTIS
  411. FOSECOIND
  412. FOURSOFT
  413. FSL
  414. FTCPOF5YDV
  415. FTCPOF5YGR
  416. FUTUREVENT
  417. GABRIEL
  418. GAEL
  419. GAIL
  420. GAL
  421. GALLANTT
  422. GALLISPAT
  423. GAMMNINFRA
  424. GAMMONIND
  425. GANDHITUBE
  426. GANESHHOUC
  427. GARDENSILK
  428. GARWALLROP
  429. GATI
  430. GAYAPROJ
  431. GDL
  432. GEECEE
  433. GEINDSYS
  434. GEMINI
  435. GENESYS
  436. GENUSPOWER
  437. GEODESIC
  438. GEOJITBNPP
  439. GEOMETRIC
  440. GESHIP
  441. GHCL
  442. GICHSGFIN
  443. GILLANDERS
  444. GILLETTE
  445. GINNIFILA
  446. GIPCL
  447. GISOLUTION
  448. GITANJALI
  449. GKB
  450. GKWLIMITED
  451. GLAXO
  452. GLENMARK
  453. GLOBALVECT
  454. GLOBOFFS
  455. GLOBUSSPR
  456. GLODYNE
  457. GLORY
  458. GMBREW
  459. GMDCLTD
  460. GMRINFRA
  461. GNFC
  462. GOACARBON
  463. GODFRYPHLP
  464. GODREJCP
  465. GODREJIND
  466. GODREJPROP
  467. GOENKA
  468. GOKEX
  469. GOKUL
  470. GOLDENTOBC
  471. GOLDIAM
  472. GOLDINFRA
  473. GOLDTECH
  474. GPIL
  475. GPPL
  476. GRABALALK
  477. GRANULES
  478. GRAPHITE
  479. GRASIM
  480. GRAVITA
  481. GREAVESCOT
  482. GREENPLY
  483. GREENPOWER
  484. GRINDWELL
  485. GRUH
  486. GSFC
  487. GSKCONS
  488. GSLNOVA
  489. GSPL
  490. GSS
  491. GTL
  492. GTLINFRA
  493. GTNIND
  494. GTNTEX
  495. GTOFFSHORE
  496. GUFICBIO
  497. GUJALKALI
  498. GUJAPOLLO
  499. GUJFLUORO
  500. GUJNRECOKE
  501. GUJNREDVR
  502. GUJRATGAS
  503. GUJSIDHCEM
  504. GULFOILCOR
  505. GVKPIL
  506. HALONIX
  507. HANUNG
  508. HARRMALAYA
  509. HATHWAY
  510. HAVELLS
  511. HBLPOWER
  512. HBSTOCK
  513. HCC
  514. HCIL
  515. HCL-INSYS
  516. HCLTECH
  517. HDFC
  518. HDFCBANK
  519. HDIL
  520. HEG
  521. HEIDELBERG
  522. HELIOSMATH
  523. HERCULES
  524. HERITGFOOD
  525. HEROMOTOCO
  526. HEXAWARE
  527. HFCL
  528. HGS
  529. HIKAL
  530. HILTON
  531. HIMATSEIDE
  532. HINDALCO
  533. HINDCOMPOS
  534. HINDCOPPER
  535. HINDDORROL
  536. HINDMOTORS
  537. HINDNATGLS
  538. HINDOILEXP
  539. HINDPETRO
  540. HINDUJAFO
  541. HINDUJAVEN
  542. HINDUNILVR
  543. HINDZINC
  544. HIRAFERRO
  545. HIRECT
  546. HITACHIHOM
  547. HITECHGEAR
  548. HITECHPLAS
  549. HMT
  550. HMVL
  551. HOCL
  552. HONAUT
  553. HONDAPOWER
  554. HOPFL
  555. HORIZONINF
  556. HOTELEELA
  557. HOTELRUGBY
  558. HOVS
  559. HSIL
  560. HTMEDIA
  561. HUBTOWN
  562. HYDRBADIND
  563. HYDROS&S
  564. IBPOW
  565. IBREALEST
  566. IBSEC
  567. IBWSL
  568. ICICIBANK
  569. ICIL
  570. ICRA
  571. ICSA
  572. IDBI
  573. IDEA
  574. IDFC
  575. IFBAGRO
  576. IFBIND
  577. IFCI
  578. IFGLREFRAC
  579. IGARASHI
  580. IGL
  581. IGPL
  582. IL&FSENGG
  583. IL&FSTRANS
  584. IMFA
  585. IMPAL
  586. IMPEXFERRO
  587. INDBANK
  588. INDHOTEL
  589. INDIABULLS
  590. INDIACEM
  591. INDIAGLYCO
  592. INDIAINFO
  593. INDIANB
  594. INDIANCARD
  595. INDIANHUME
  596. INDLMETER
  597. INDNIPPON
  598. INDOCO
  599. INDORAMA
  600. INDOSOLAR
  601. INDOTECH
  602. INDOTHAI
  603. INDOWIND
  604. INDRAMEDCO
  605. INDSWFTLAB
  606. INDSWFTLTD
  607. INDTERRAIN
  608. INDUSFILA
  609. INDUSINDBK
  610. INEABS
  611. INFINITE
  612. INFODRIVE
  613. INFOMEDIA
  614. INFOTECENT
  615. INFY
  616. INGERRAND
  617. INGVYSYABK
  618. INNOIND
  619. INOXLEISUR
  620. INSECTICID
  621. INVENTURE
  622. IOB
  623. IOC
  624. IOLCP
  625. IOLN
  626. IPCALAB
  627. IPRINGLTD
  628. IRB
  629. ISFT
  630. ISMTLTD
  631. ITC
  632. ITDCEM
  633. ITI
  634. IVC
  635. IVP
  636. IVRCLAH
  637. IVRCLINFRA
  638. J&KBANK
  639. JAGRAN
  640. JAGSNPHARM
  641. JAIBALAJI
  642. JAICORPLTD
  643. JAINSTUDIO
  644. JAMNAAUTO
  645. JAYAGROGN
  646. JAYBARMARU
  647. JAYNECOIND
  648. JAYSREETEA
  649. JBCHEPHARM
  650. JBFIND
  651. JBMA
  652. JCTEL
  653. JDORGOCHEM
  654. JENSONICOL
  655. JETAIRWAYS
  656. JHS
  657. JIKIND
  658. JINDALPHOT
  659. JINDALPOLY
  660. JINDALSAW
  661. JINDALSTEL
  662. JINDALSWHL
  663. JINDCOT
  664. JINDRILL
  665. JINDWORLD
  666. JISLDVREQS
  667. JISLJALEQS
  668. JKCEMENT
  669. JKIL
  670. JKLAKSHMI
  671. JKPAPER
  672. JKTYRE
  673. JMCPROJECT
  674. JMFINANCIL
  675. JMTAUTOLTD
  676. JOCIL
  677. JPASSOCIAT
  678. JPINFRATEC
  679. JPPOWER
  680. JSL
  681. JSWENERGY
  682. JSWISPAT
  683. JSWSTEEL
  684. JUBILANT
  685. JUBLFOOD
  686. JUBLINDS
  687. JUMBO
  688. JUPITER
  689. JVLAGRO
  690. JYOTHYLAB
  691. JYOTISTRUC
  692. KABRAEXTRU
  693. KAJARIACER
  694. KAKATCEM
  695. KALECONSUL
  696. KALINDEE
  697. KALPATPOWR
  698. KAMATHOTEL
  699. KANANIIND
  700. KANDAGIRI
  701. KANORICHEM
  702. KANSAINER
  703. KARURKCP
  704. KARURVYSYA
  705. KAUSHALYA
  706. KAVVERITEL
  707. KBIL
  708. KCP
  709. KCPSUGIND
  710. KEC
  711. KECL
  712. KEI
  713. KEMROCK
  714. KERNEX
  715. KESARENT
  716. KESORAMIND
  717. KEYCORPSER
  718. KFA
  719. KGL
  720. KHAITANELE
  721. KHAITANLTD
  722. KHANDSE
  723. KICL
  724. KIL
  725. KILITCH
  726. KINETICMOT
  727. KIRIINDUS
  728. KIRLOSBROS
  729. KIRLOSENG
  730. KIRLOSIND
  731. KITPLYIND
  732. KKCL
  733. KLGSYSTEL
  734. KLRF
  735. KMSUGAR
  736. KNRCON
  737. KOHINOOR
  738. KOLTEPATIL
  739. KOPRAN
  740. KOTAKBANK
  741. KOTARISUG
  742. KOTHARIPET
  743. KOTHARIPRO
  744. KOUTONS
  745. KPIT
  746. KPRMILL
  747. KRBL
  748. KRISHNAENG
  749. KRITIIND
  750. KSBPUMPS
  751. KSCL
  752. KSE
  753. KSERASERA
  754. KSK
  755. KSL
  756. KSOILS
  757. KTIL
  758. KTKBANK
  759. KWALITY
  760. L&TFH
  761. LAKPRE
  762. LAKSHMIEFL
  763. LAKSHMIMIL
  764. LAKSHVILAS
  765. LANCOIN
  766. LAOPALA
  767. LAXMIMACH
  768. LCCINFOTEC
  769. LGBBROSLTD
  770. LGBFORGE
  771. LIBERTSHOE
  772. LICHSGFIN
  773. LITL
  774. LLOYDELENG
  775. LLOYDFIN
  776. LLOYDSTEEL
  777. LML
  778. LOGIXMICRO
  779. LOKESHMACH
  780. LOTUSEYE
  781. LOVABLE
  782. LPDC
  783. LT
  784. LUMAXAUTO
  785. LUMAXIND
  786. LUMAXTECH
  787. LUPIN
  788. LYKALABS
  789. M&M
  790. M&MFIN
  791. MAANALU
  792. MADHAV
  793. MADHUCON
  794. MADRASCEM
  795. MADRASFERT
  796. MAGMA
  797. MAGNUM
  798. MAHABANK
  799. MAHINDFORG
  800. MAHINDUGIN
  801. MAHLIFE
  802. MAHSCOOTER
  803. MAHSEAMLES
  804. MAITHANALL
  805. MALUPAPER
  806. MALWACOTT
  807. MANAKSIA
  808. MANALIPETC
  809. MANAPPURAM
  810. MANDHANA
  811. MANGALAM
  812. MANGCHEFER
  813. MANGLMCEM
  814. MANGTIMBER
  815. MANINDS
  816. MANINFRA
  817. MANJEERA
  818. MANJUSHREE
  819. MANUGRAPH
  820. MARALOVER
  821. MARG
  822. MARICO
  823. MARKSANS
  824. MARUTI
  825. MASTEK
  826. MAWANASUG
  827. MAX
  828. MAXWELL
  829. MBECL
  830. MBLINFRA
  831. MBSWITCH
  832. MCDHOLDING
  833. MCDOWELL-N
  834. MCLEODRUSS
  835. MEGASOFT
  836. MEGH
  837. MELSTAR
  838. MERCATOR
  839. MERCK
  840. MHRIL
  841. MIC
  842. MICROSEC
  843. MICROTECH
  844. MINDAIND
  845. MINDTREE
  846. MIRCELECTR
  847. MIRZAINT
  848. MMFL
  849. MMFSL
  850. MMTC
  851. MOIL
  852. MONNETISPA
  853. MONSANTO
  854. MORARJETEX
  855. MOREPENLAB
  856. MOSERBAER
  857. MOTHERSUMI
  858. MOTILALOFS
  859. MOTOGENFIN
  860. MPHASIS
  861. MPSLTD
  862. MRF
  863. MRO-TEK
  864. MRPL
  865. MSPL
  866. MTNL
  867. MUDRA
  868. MUKANDENGG
  869. MUKANDLTD
  870. MUKTAARTS
  871. MUNDRAPORT
  872. MUNJALAU
  873. MUNJALSHOW
  874. MURUDCERA
  875. MUTHOOTFIN
  876. MVL
  877. MVLIND
  878. MYSOREBANK
  879. NAGREEKCAP
  880. NAGREEKEXP
  881. NAHARCAP
  882. NAHARINDUS
  883. NAHARPOLY
  884. NAHARSPING
  885. NANDAN
  886. NATCOPHARM
  887. NATHSEED
  888. NATIONALUM
  889. NATNLSTEEL
  890. NAUKRI
  891. NAVINFLUOR
  892. NAVNETPUBL
  893. NBVENTURES
  894. NCC
  895. NCLIND
  896. NCOPPER
  897. NDTV
  898. NECLIFE
  899. NEHAINT
  900. NELCAST
  901. NELCO
  902. NEPCMICON
  903. NESCO
  904. NESTLEIND
  905. NET4
  906. NETWORK18
  907. NEULANDLAB
  908. NEXTMEDIA
  909. NEYVELILIG
  910. NFL
  911. NHPC
  912. NICCO
  913. NIITLTD
  914. NIITTECH
  915. NILKAMAL
  916. NIPPOBATRY
  917. NITCO
  918. NITESHEST
  919. NITINFIRE
  920. NITINSPIN
  921. NMDC
  922. NOCIL
  923. NOIDATOLL
  924. NOL
  925. NORBTEAEXP
  926. NORTHGATE
  927. NOVOPANIND
  928. NRBBEARING
  929. NRC
  930. NSIL
  931. NTPC
  932. NUCENT
  933. NUCHEM
  934. NUCLEUS
  935. NUMERICPW
  936. NUTEK
  937. OBEROIRLTY
  938. OCL
  939. OFSS
  940. OIL
  941. OILCOUNTUB
  942. OISL
  943. OMAXAUTO
  944. OMAXE
  945. OMKARCHEM
  946. OMMETALS
  947. OMNITECH
  948. ONELIFECAP
  949. ONGC
  950. ONMOBILE
  951. ONWARDTEC
  952. OPTOCIRCUI
  953. ORBITCORP
  954. ORCHIDCHEM
  955. ORIENTABRA
  956. ORIENTALTL
  957. ORIENTBANK
  958. ORIENTCERA
  959. ORIENTHOT
  960. ORIENTLTD
  961. ORIENTPPR
  962. ORISSAMINE
  963. OSWALMIN
  964. OUDHSUG
  965. PAEL
  966. PAGEIND
  967. PANACEABIO
  968. PANAMAPET
  969. PANASONIC
  970. PANCARBON
  971. PANCHSHEEL
  972. PANORAMUNI
  973. PANTALOONR
  974. PAPERPROD
  975. PARABDRUGS
  976. PARACABLES
  977. PARAL
  978. PARAPRINT
  979. PARASPETRO
  980. PARRYSUGAR
  981. PARSVNATH
  982. PATELENG
  983. PATINTLOG
  984. PATNI
  985. PATSPINLTD
  986. PBAINFRA
  987. PDUMJEIND
  988. PDUMJEPULP
  989. PEACOCKIND
  990. PEARLPOLY
  991. PENIND
  992. PENINLAND
  993. PEPL
  994. PERIATEA
  995. PERSISTENT
  996. PETRONENGG
  997. PETRONET
  998. PFC
  999. PFIZER
  1000. PFOCUS
  1001. PFS
  1002. PGEL
  1003. PGHH
  1004. PHILIPCARB
  1005. PHOENIXLTD
  1006. PIDILITIND
  1007. PIIND
  1008. PIONDIST
  1009. PIONEEREMB
  1010. PIPAVAVDOC
  1011. PIRGLASS
  1012. PIRHEALTH
  1013. PITTILAM
  1014. PLASTIBLEN
  1015. PLETHICO
  1016. PNB
  1017. PNBGILTS
  1018. PNC
  1019. POCHIRAJU
  1020. POLARIND
  1021. POLARIS
  1022. POLYMED
  1023. POLYPLEX
  1024. PONDYOXIDE
  1025. PONNIERODE
  1026. POWERGRID
  1027. PPAP
  1028. PRADIP
  1029. PRAENG
  1030. PRAJIND
  1031. PRAKASH
  1032. PRAKASHCON
  1033. PRAKASHSTL
  1034. PRATIBHA
  1035. PRECOT
  1036. PRECWIRE
  1037. PREMIER
  1038. PRESTIGE
  1039. PRETAILDVR
  1040. PRICOL
  1041. PRIMESECU
  1042. PRISMCEM
  1043. PRITHVI
  1044. PRITHVISOF
  1045. PROVOGUE
  1046. PSB
  1047. PSL
  1048. PTC
  1049. PTL
  1050. PUNJABCHEM
  1051. PUNJLLOYD
  1052. PURVA
  1053. PVP
  1054. PVR
  1055. QUINTEGRA
  1056. RADAAN
  1057. RADICO
  1058. RAINBOWPAP
  1059. RAINCOM
  1060. RAJESHEXPO
  1061. RAJOIL
  1062. RAJPALAYAM
  1063. RAJRAYON
  1064. RAJSREESUG
  1065. RAJTV
  1066. RAJVIR
  1067. RALLIS
  1068. RAMANEWS
  1069. RAMCOIND
  1070. RAMKY
  1071. RAMSARUP
  1072. RANASUG
  1073. RANBAXY
  1074. RANEENGINE
  1075. RANEHOLDIN
  1076. RANKLIN
  1077. RASOYPR
  1078. RATNAMANI
  1079. RAYMOND
  1080. RBL
  1081. RBN
  1082. RCF
  1083. RCOM
  1084. RECLTD
  1085. REDINGTON
  1086. REFEX
  1087. REIAGROLTD
  1088. REISIXTEN
  1089. RELAXO
  1090. RELCAPITAL
  1091. RELIANCE
  1092. RELIGARE
  1093. RELINFRA
  1094. RELMEDIA
  1095. REMSONSIND
  1096. RENUKA
  1097. REPRO
  1098. RESPONIND
  1099. RESURGERE
  1100. REVATHI
  1101. RICOAUTO
  1102. RIIL
  1103. RJL
  1104. RKDL
  1105. RKFORGE
  1106. RMCL
  1107. RML
  1108. ROHITFERRO
  1109. ROHLTD
  1110. ROLTA
  1111. ROMAN
  1112. RPGLIFE
  1113. RPOWER
  1114. RPPINFRA
  1115. RSSOFTWARE
  1116. RSWM
  1117. RUBYMILLS
  1118. RUCHINFRA
  1119. RUCHIRA
  1120. RUCHISOYA
  1121. RUPA
  1122. RUSHIL
  1123. SABERORGAN
  1124. SABTN
  1125. SADBHAV
  1126. SAGCEM
  1127. SAHPETRO
  1128. SAIL
  1129. SAKHTISUG
  1130. SAKSOFT
  1131. SAKUMA
  1132. SALONACOT
  1133. SALORAINTL
  1134. SALSTEEL
  1135. SAMBANDAM
  1136. SAMBHAAV
  1137. SANDESH
  1138. SANGAMIND
  1139. SANGHVIFOR
  1140. SANGHVIMOV
  1141. SANWARIA
  1142. SARDAEN
  1143. SAREGAMA
  1144. SARLAPOLY
  1145. SARTHAKIND
  1146. SASKEN
  1147. SATHAISPAT
  1148. SATYAMCOMP
  1149. SAVERA
  1150. SB&TINTL
  1151. SBBJ
  1152. SBIN
  1153. SBT
  1154. SCI
  1155. SEAMECLTD
  1156. SEINV
  1157. SELAN
  1158. SELMCL
  1159. SERVALL
  1160. SESAGOA
  1161. SESHAPAPER
  1162. SEZAL
  1163. SGFL
  1164. SGJHL
  1165. SHAHALLOYS
  1166. SHAKTIPUMP
  1167. SHALPAINTS
  1168. SHANTIGEAR
  1169. SHARONBIO
  1170. SHARRESLTD
  1171. SHASUNPHAR
  1172. SHILPAMED
  1173. SHILPI
  1174. SHIV-VANI
  1175. SHIVAMAUTO
  1176. SHIVTEX
  1177. SHLAKSHMI
  1178. SHOPERSTOP
  1179. SHPRE
  1180. SHREEASHTA
  1181. SHREECEM
  1182. SHREERAMA
  1183. SHRENUJ
  1184. SHREYANIND
  1185. SHREYAS
  1186. SHRIRAMCIT
  1187. SHRIRAMEPC
  1188. SHYAMTEL
  1189. SICAGEN
  1190. SICAL
  1191. SIEMENS
  1192. SILINV
  1193. SIMBHSUGAR
  1194. SIMPLEX
  1195. SIMPLEXINF
  1196. SINTEX
  1197. SIRPAPER
  1198. SITASHREE
  1199. SIYSIL
  1200. SJVN
  1201. SKFINDIA
  1202. SKMEGGPROD
  1203. SKSMICRO
  1204. SKUMARSYNF
  1205. SMLISUZU
  1206. SMOBILITY
  1207. SMPL
  1208. SMSPHARMA
  1209. SOBHA
  1210. SOFTTECHGR
  1211. SOLARINDS
  1212. SOMANYCERA
  1213. SONASTEER
  1214. SONATSOFTW
  1215. SOTL
  1216. SOUISPAT
  1217. SOUTHBANK
  1218. SPANCO
  1219. SPARC
  1220. SPECTACLE
  1221. SPENTEX
  1222. SPIC
  1223. SPLIL
  1224. SPMLINFRA
  1225. SPYL
  1226. SREINFRA
  1227. SRF
  1228. SRGINFOTEC
  1229. SRHHLINDST
  1230. SRHHYPOLTD
  1231. SRICHAMUND
  1232. SRSLTD
  1233. SRTRANSFIN
  1234. SSWL
  1235. STAR
  1236. STARPAPER
  1237. STCINDIA
  1238. STEL
  1239. STER
  1240. STERLINBIO
  1241. STERTOOLS
  1242. STINDIA
  1243. STOREONE
  1244. STRTECH
  1245. SUBEX
  1246. SUBROS
  1247. SUDAR
  1248. SUDARSCHEM
  1249. SUJANATOW
  1250. SUJANAUNI
  1251. SUMEETINDS
  1252. SUMMITSEC
  1253. SUNCLAYTON
  1254. SUNDARAM
  1255. SUNDARMFIN
  1256. SUNDRMBRAK
  1257. SUNDRMFAST
  1258. SUNFLAG
  1259. SUNILHITEC
  1260. SUNPHARMA
  1261. SUNTECK
  1262. SUNTV
  1263. SUPER
  1264. SUPERSPIN
  1265. SUPPETRO
  1266. SUPRAJIT
  1267. SUPREMEIND
  1268. SUPREMEINF
  1269. SURAJDIAMN
  1270. SURANACORP
  1271. SURANAIND
  1272. SURANAT&P
  1273. SURANAVEL
  1274. SURYAJYOTI
  1275. SURYALAXMI
  1276. SURYAPHARM
  1277. SURYAROSNI
  1278. SUTLEJTEX
  1279. SUVEN
  1280. SUZLON
  1281. SWARAJENG
  1282. SYMPHONY
  1283. SYNCOM
  1284. SYNDIBANK
  1285. TAINWALCHM
  1286. TAJGVK
  1287. TAKE
  1288. TAKSHEEL
  1289. TALBROAUTO
  1290. TALWALKARS
  1291. TANFACIND
  1292. TANLA
  1293. TANTIACONS
  1294. TARAPUR
  1295. TATACHEM
  1296. TATACOFFEE
  1297. TATACOMM
  1298. TATAELXSI
  1299. TATAGLOBAL
  1300. TATAINVEST
  1301. TATAMETALI
  1302. TATAMOTORS
  1303. TATAMTRDVR
  1304. TATAPOWER
  1305. TATASPONGE
  1306. TATASTEEL
  1307. TCI
  1308. TCIDEVELOP
  1309. TCIFINANCE
  1310. TCS
  1311. TDPOWERSYS
  1312. TECHM
  1313. TECHNO
  1314. TECHNOFAB
  1315. TECPRO
  1316. TEXMACOLTD
  1317. TEXMOPIPES
  1318. TEXRAIL
  1319. TFCILTD
  1320. TFL
  1321. THANGAMAYL
  1322. THEBYKE
  1323. THEMISMED
  1324. THERMAX
  1325. THINKSOFT
  1326. THIRUSUGAR
  1327. THOMASCOOK
  1328. TI
  1329. TIDEWATER
  1330. TIIL
  1331. TIJARIA
  1332. TIL
  1333. TIMBOR
  1334. TIMESGTY
  1335. TIMETECHNO
  1336. TIMKEN
  1337. TINPLATE
  1338. TIPSINDLTD
  1339. TIRUMALCHM
  1340. TITAN
  1341. TNPETRO
  1342. TNPL
  1343. TODAYS
  1344. TORNTPHARM
  1345. TORNTPOWER
  1346. TREEHOUSE
  1347. TRENT
  1348. TRF
  1349. TRICOM
  1350. TRIDENT
  1351. TRIGYN
  1352. TRIL
  1353. TRITURBINE
  1354. TRIVENI
  1355. TTKHEALTH
  1356. TTKPRESTIG
  1357. TTL
  1358. TTML
  1359. TUBEINVEST
  1360. TULIP
  1361. TULSI
  1362. TULSYAN
  1363. TV18BRDCST
  1364. TVSELECT
  1365. TVSMOTOR
  1366. TVSSRICHAK
  1367. TVTODAY
  1368. TWILITAKA
  1369. TWL
  1370. UBENGG
  1371. UBHOLDINGS
  1372. UBL
  1373. UCALFUEL
  1374. UCOBANK
  1375. UFLEX
  1376. UGARSUGAR
  1377. ULTRACEMCO
  1378. UMESLTD
  1379. UNICHEMLAB
  1380. UNIENTER
  1381. UNIONBANK
  1382. UNIPHOS
  1383. UNIPLY
  1384. UNITECH
  1385. UNITEDBNK
  1386. UNITEDTEA
  1387. UNITY
  1388. UNIVCABLES
  1389. UPERGANGES
  1390. USHAMART
  1391. USHERAGRO
  1392. UTTAMSTL
  1393. UTTAMSUGAR
  1394. UTVSOF
  1395. VADILALIND
  1396. VAIBHAVGEM
  1397. VAKRANSOFT
  1398. VALECHAENG
  1399. VALUEIND
  1400. VARDHACRLC
  1401. VARDMNPOLY
  1402. VARUN
  1403. VARUNSHIP
  1404. VASCONEQ
  1405. VASWANI
  1406. VENKEYS
  1407. VENUSREM
  1408. VESUVIUS
  1409. VGUARD
  1410. VHL
  1411. VICEROY
  1412. VIDEOIND
  1413. VIJAYABANK
  1414. VIJSHAN
  1415. VIKASHMET
  1416. VIMTALABS
  1417. VINATIORGA
  1418. VINDHYATEL
  1419. VINYLINDIA
  1420. VIPIND
  1421. VIPUL
  1422. VISAKAIND
  1423. VISASTEEL
  1424. VISESHINFO
  1425. VISUINTL
  1426. VIVIMEDLAB
  1427. VLSFINANCE
  1428. VOLTAMP
  1429. VOLTAS
  1430. VSTIND
  1431. VSTTILLERS
  1432. VTL
  1433. VTMLTD
  1434. VTXIND
  1435. WABAG
  1436. WABCOINDIA
  1437. WALCHANNAG
  1438. WANBURY
  1439. WEBELSOLAR
  1440. WEIZFOREX
  1441. WEIZMANIND
  1442. WELCORP
  1443. WELGLOB
  1444. WELINV
  1445. WELPROJ
  1446. WELSPUNIND
  1447. WENDT
  1448. WHEELS
  1449. WHIRLPOOL
  1450. WILLAMAGOR
  1451. WINDMACHIN
  1452. WINSOME
  1453. WIPRO
  1454. WOCKPHARMA
  1455. WSI
  1456. WSTCSTPAPR
  1457. WWIL
  1458. WYETH
  1459. XLENERGY
  1460. XPROINDIA
  1461. YESBANK
  1462. ZANDUREALT
  1463. ZEEL
  1464. ZEENEWS
  1465. ZENITHBIR
  1466. ZENITHCOMP
  1467. ZENITHEXPO
  1468. ZENITHINFO
  1469. ZENSARTECH
  1470. ZICOM
  1471. ZODIACLOTH
  1472. ZODJRDMKJ
  1473. ZUARIAGRO
  1474. ZYDUSWELL
  1475. ZYLOG
Above is a snapshot of the list of valid NSE symbols, fetched on 2nd January 2012 from http://goo.gl/jz7NY
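
If you want to refresh this list yourself rather than copy it from here, a minimal sketch along the following lines should work. Everything in it is an assumption on my part: that the short link above still redirects to a CSV of listed equities, and that the CSV has a "SYMBOL" column (as NSE's equity-list CSV did at the time of this snapshot) -- verify both before relying on the output.

    # fetch_nse_symbols.py -- minimal sketch for refreshing the symbol list.
    # Assumptions (not guarantees): the short URL still redirects to a CSV of
    # listed equities, and that CSV carries a "SYMBOL" column.
    import csv
    import io
    import urllib.request

    URL = "http://goo.gl/jz7NY"  # short link from this post; may have expired

    def fetch_symbols(url=URL):
        # Some servers reject requests without a browser-like User-Agent.
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        with urllib.request.urlopen(req) as resp:
            text = resp.read().decode("utf-8", errors="replace")
        # Parse the CSV by header name and pull out the symbol column.
        reader = csv.DictReader(io.StringIO(text))
        return sorted(row["SYMBOL"].strip() for row in reader if row.get("SYMBOL"))

    if __name__ == "__main__":
        for i, sym in enumerate(fetch_symbols(), 1):
            print("%d. %s" % (i, sym))

If the short link no longer resolves, pointing the same sketch at whatever CSV NSE currently publishes for listed equities should produce an up-to-date version of the list above.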

PS: This information is shared in the hope that it might be useful, and nothing else.