All posts by Abhishek Nagekar

Event Driven Programming

I remember starting to learn Javascript. It was all a nightmare. Functions used to execute out of order and seemed completely random. There was chaos all around, and I was usually left wondering, ‘what is the order of execution here?’ Of course, that question never got answered, because that wasn’t the right question. The right question would’ve been: what drives these functions in this code?

Fortunately, I asked it at an early stage. The things that make Javascript tick are called events. Without them, your browser window is just a dead canvas. Not just the browser: other Javascript environments work because of events as well. And once you understand them, events and event driven programming are probably the most intuitive way of developing applications. Events are not difficult to visualize once you start relating them to real-world events.

Suppose you’re alone at home. You need to make sure you open the door in case someone arrives. There are two ways of doing it. The first way is to periodically keep checking if someone is outside the door, say every 5 minutes. In the best case, someone arrives the moment you open the door, but in the worst case, a visitor has to wait for about 5 minutes before you make your next round. This is just ridiculous. It is inefficient (visitors have to wait for you) and a waste of time (when you check but no one has arrived). Who does this? Well, most procedural languages work this way. And it works, for their use cases. But when you’re dealing with real life scenarios, like checking doors or handling HTML form responses in browsers, where the time of completion of an action or the arrival of a visitor is unknown, a better approach has to be used, especially when time, muscle and browser resources are so scarce. Enter event driven programming.

Suppose you’re alone again, in the same house. But this time, you decide to make use of events. You install a doorbell. Now, whenever someone arrives and rings the bell, you know you have to go open the door. That’s simple. That’s event driven. It doesn’t matter if the next visitor comes after a minute or after an hour. You’ll open the door right away, without wasting your own time. The fact that most homes have doorbells, and that this is our natural workflow when dealing with similar situations, is proof of how intuitive this programming paradigm is.

Okay, that was cool so far. But I know that I have to listen for a doorbell and then, on hearing it, walk to the door to open it. How do I make the machine do that? The answer is callbacks. If you are not quite sure what they are or how they work, do check out my other article on how callbacks and the event loop work. So essentially, we attach a ‘callback’ to an event. The callback, as the name suggests, calls back when the event occurs, and lets us control the aftereffect. Such a callback is also called an event listener, which in fact is what you attach with element.addEventListener('event', callback); when working in the DOM. In the Node.js environment, you get an EventEmitter class that you can use to emit and listen to custom events. If you haven’t heard of it, socket.io is all about custom events and data transfer between the client and the server. It makes use of websockets along with XHRs. If you are into creating sockets and exchanging data between remote hosts like I am, socket.io is for you.
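Here is a minimal sketch of both flavours (the element id and the 'ring' event name are made up for illustration):

// Browser: attach a callback (event listener) to a DOM event
document.getElementById('doorbell').addEventListener('click', function () {
  console.log('Someone is at the door, go open it!');
});

// Node.js: the built-in EventEmitter lets you emit and listen to custom events
var EventEmitter = require('events').EventEmitter;
var door = new EventEmitter();

door.on('ring', function (visitor) {
  console.log('Opening the door for ' + visitor);
});

door.emit('ring', 'the mailman'); // prints: Opening the door for the mailman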

Finally, I hope to have given you some insight into event driven programming with Javascript. Not a lot of code, but that is something you’ll find everywhere. I always like to emphasize the concepts rather than the implementation. It makes for a better implementation.

Gentoo Experience

I started using GNU/Linux full time sometime in 2011. Before that, it was all tiny bits here and there, VirtualBox and stuff like that. But then, I finally made up my mind to replace the Windows based OS I was running with Linux. Trust me, it was a very scary decision at the time, and Backtrack 5 was my first full time Linux distro.

From there, the ultimate goal was to try out every major distro. I used Backtrack till version 5r3, when they dropped support. Then I tried Ubuntu, since I had read Backtrack was Ubuntu based. Switched to Kali Linux, for it was the latest. After that was CentOS 6. I remember using it because I had the same on my DigitalOcean VPS. VPS gone, it was time to switch. I tried Fedora, but GNOME 3 didn’t appeal to me, and at this point I was still unaware of what a desktop environment was, or how to change one. Later I switched to Debian 7, which was when I fell in love with it. Attended the launch party of Debian 8 and started using it right away. Debian stable had some really outdated packages, and the GNOME environment didn’t appeal to me either. I installed Debian testing, running XFCE. It was the setup I always needed and is still my daily driver. In the meantime, I tried openSUSE too, but nothing great. As you see, I don’t really need a reason to switch distros.

What really provoked me was a sentence I read somewhere: ‘if you have not used Slackware or Gentoo, you really have not used Linux’. That was 3 years ago. Since then, I made multiple attempts at installing Slackware, Gentoo and Arch, but somehow I would mess something up and it would be a failure. You should note here that installing these systems is not an easy task, especially when you don’t understand the commands you are typing. This time was no different. I downloaded the ISO, which was surprisingly just ~250 MB. I burnt it to a disk and booted it up.

The scariest part of it all is that there is no ‘installer’. You have to manually create the partitions, the swap and the bootloader. Then there is the phase where you download the ‘stage3’ file to the root and untar it. Chroot into it, set up a few things like the network, download the kernel source, compile it, add it to the /boot directory, and then comes the hardest stage: reboot. Hard emotionally, that is. After a minute, you are either greeted by the login prompt, or you realize that the past 8 hours of your life went in vain.

Luckily for me though, it booted up. It was after midnight when it did, and I had been installing it since 2 pm. I’m a noob. It is not a lot different from other distros, if you know how to build software. There is this package manager called emerge which is helpful too (it downloads sources and builds them, resolving dependencies). I am currently running it from the command line, as I didn’t set up Xorg during installation. It is pretty usable, and the best part is that you actually understand your system. I realized that this is a great way to actually learn GNU/Linux. The thing I have to live with is that since each and every installation compiles from source, it is damn slow. Usually 15-30 minutes for most applications. The plus point is that since everything is built for my particular system, the executables are smaller than usual (and I read they are faster too). Cool, right?

I am looking forward to using it every day on my PC, at least for some time till I get really good at it. Maybe then I’ll try doing the same on my laptop, which would be nice. For now, I’ll have to figure out a way to install Xorg.

Going Full Stack On Mobile

I have been a part of numerous web projects, and after that many applications using multiple languages and frameworks, when my friend Kunal suggested that we develop an Android application that would make use of a backend and stuff like that, I couldn’t help but desperately show my excitement. I told my friends, family and even seniors about the thing I was working on. After all, it was my first time going full stack on mobile.

To clarify things, I am still doing the backend. It is my little team that has gone full stack. On mobile. The application, called moodShare, will enable a user to do a few simple tasks like registering, logging in, adding friends, sharing statuses, reading statuses and stuff like that. Each task is made possible with a simple API that was built without much thought to future scaling. I know, that is really bad on my side. But then, I was so desperate to get an application up and running that I neglected a lot of issues that arose while we were developing.

The application’s backend was Mongo, Express and Node. We deployed it to Modulus.io but then this happened and we had to shift it to Heroku.

Too poor to get a proper server… Crashing in 3..2..1 #javascript #nodejs


And this was when we had around 4 live users. The code quality was poor, very poor. Trust me when I say it. 4 users were hitting the database around 7 times a second.

Beautiful sight of requests hitting the API server #javascript #nodejs


Then there were many other issues as well. For example, if you added a friend who had 400 status updates to his or her name before you added them, your phone would notify you, well, 400 times. Literally. That was awful. Push notification was implemented the way you would write an infinite loop to check if something has changed.

But hey, this was our first time. We did enjoy each and every second of the development (around 4 days), and the bugs just as much. We did learn a lot from it. Before this project, a message queue was just a name, Google Cloud Messaging was just another service I knew nothing about, and I knew maybe three or four HTTP status codes. The satisfaction of knowing more about these was worth it all. Lastly, we did not abandon the project. Our alpha version was pretty stiff and unusable. Hopefully, with better APIs, better hosting and material design on the frontend, we plan to rock with the beta version. Stay tuned.


Callbacks And Event Loop Explained

Callbacks are the things that make writing Javascript fun. If you think you are better off writing Javascript the way you wrote C and Java, then you are not really feeling the essence of this beautiful language. I found it rather hard to get used to this style of programming, affectionately called the non-blocking style. What this difficult sounding technique is, and how you can make use of it to write better code, is what we’re going to see in this little essay.

Let’s start with the most basic explanation of what a callback is. Suppose I have two functions, A and B. Although I can execute both of them myself, what I can do is execute A and ask A to execute B for me. Yes, that is all there is to the callbacks thing. I have two functions, I execute one and then ask it to execute the other one for me. Simple.
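In code, that is all it takes (a minimal sketch):

function B() {
  console.log('B was called back');
}

function A(callback) {
  console.log('A is doing its own work first');
  callback(); // A executes B for us
}

A(B); // prints the line from A, then the line from B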

But, you might ask, why would I ever want to do such a thing? Well, think about it this way. Computer processing is way, way faster than any I/O in question, especially when the I/O is on the other side of the network (or the globe, for that matter), which is usual in the case of most web applications. Now, if you write code such that it queries a remote (or local) database and uses those results to carry on further processing, it results in a bottleneck. The processor executes instructions in the order of nanoseconds. An I/O request to the disk or network takes a few milliseconds at best, a million times longer than what the processor needs for a single atomic task. So writing code that waits for results from I/O is simply a waste of precious CPU cycles that could’ve been put to better use in the meantime.

A callback is also of great use when you want to make a function, say A, do things according to the context by supplying a specialized function B at runtime.
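For example, the comparator you pass to Array.prototype.sort is exactly such a callback: the same sort routine behaves differently depending on the function it is given (a quick sketch):

var scores = [40, 7, 19, 73];

// Same function (sort), different behaviour, decided by the callback
scores.sort(function (a, b) { return a - b; }); // ascending:  [7, 19, 40, 73]
scores.sort(function (a, b) { return b - a; }); // descending: [73, 40, 19, 7]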

So let’s get into some (pseudo) code. We will first see what it is like to write code without callbacks, using blocking IO, and then examine some issues. Then we will write the same code using callbacks and non-blocking IO and see if we have rectified (or at least mitigated) those issues.
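Something along these lines; note that the synchronous findOneSync call below is made up for illustration, since the real MongoDB driver does not block like this:

// Pseudocode: a blocking login check
function handleLogin(request, response) {
  // Execution halts here until the database answers
  var user = db.users.findOneSync({ name: request.body.username });

  if (user && user.password === request.body.password) {
    response.send('Welcome back, ' + user.name);
  } else {
    response.send('Invalid credentials');
  }
}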

Now this code is blocked while the Javascript engine is waiting for the Mongodb query to execute and return. It may take a few milliseconds, and only after that does the control of the program move to the next lines. If the database hangs up for any reason, then the response will never reach the end user, regardless of whether the login credentials were correct or not (nor will the user know that something went wrong). It may also happen that the Javascript engine continues execution with ‘undefined’ returned, resulting in the falsy block being executed each time. Having identified these problems, let us now write the asynchronous version of the same (pseudo) code.
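A rough non-blocking equivalent, assuming the callback-style findOne of the Node.js MongoDB driver:

function handleLogin(request, response) {
  db.collection('users').findOne({ name: request.body.username }, function (err, user) {
    if (err) {
      return response.send('Something went wrong'); // the user hears back even on failure
    }
    if (user && user.password === request.body.password) {
      response.send('Welcome back, ' + user.name);
    } else {
      response.send('Invalid credentials');
    }
  });
  // Execution continues here immediately; nothing waits on the query
}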

The callback function, in a way, keeps waiting for the response from the database in the background while the flow of the program is not interrupted. As soon as the reply comes, the code in the callback function is executed. That sounds neat. Wait a second. Didn’t we all learn that Javascript is single threaded and does a single job at a time and what not? Then how is the callback supposed to listen when the browser is already executing the code below it?

To understand that, you’ll have to stop believing that everything you write in Javascript is Javascript. Yes, I mean it. Javascript is single threaded, but there is much more to Javascript than just a single call stack. OK, first: what is a call stack? Whenever a Javascript file is executed, the engine creates a context for the code to run. This is the global execution context. It sits at the bottom of the call stack. Whenever a new function is invoked (or ‘called’), a new execution context is pushed on top of the stack. After the execution completes, the execution context is popped off the stack. After the last line of code is interpreted, the global context is popped off the stack.

Well and good. But what about that setTimeout function that you set for the next 10 seconds? And our own Mongodb query? Where did those go? setTimeout is actually a web API provided to us, the Javascript developers, by the browser. Similarly, our Mongodb query command was not a Javascript thingy, but a C++ API from the Node.js bag of goodies. These events are handled by their respective environments, and when they finish execution, they enter what is called the callback queue. There, they wait until the call stack is completely empty. Once the call stack is empty (all Javascript code is done running), the event loop kicks in and pushes the callback functions waiting in the callback queue onto the stack, one by one. A great way to visualize what the above paragraph just said is to try it out yourself at loupe by Philip Roberts. He’s an amazing guy and you should totally check out his talks.
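You can see the queueing in action with nothing more than setTimeout:

console.log('first');

setTimeout(function () {
  // Handed off to the browser's timer API; the callback then waits in the
  // callback queue until the call stack is empty
  console.log('third');
}, 0);

console.log('second');
// Output: first, second, third (even with a 0 ms delay)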

So that was it for this little article. Hope to have helped you. Keep digging.

AJAX Loading

Edit: The section below this has some major issues that I only found out about after pushing to GitHub. Read the edit section below.

I always admired the sites which did not reload upon clicking links here and there, but simply gave me whatever I requested, with grace. Sites like Facebook, Twitter and Youtube use this extensively, and the application hardly reloads after the initial load. Here’s my attempt at making something similar.

Firstly, I downloaded a free HTML template from Online Casinos. The links had to be edited. The pattern I used was to replace any (internal) link pointing to, for example, example.html with #/example.html. What this did was 1) prevent the page from reloading, and 2) put the requested location somewhere I could extract it with Javascript.

Next, grab that location from the hash. Javascript’s built-in window.location.hash property provides the page’s current hash. We drop the first char from the returned hash, as it is the hash (#) character itself. The hash value is essentially the location to be navigated to, so we use some string manipulation to craft a new URL from it, make an AJAX request to fetch the contents of the requested page, and insert them into the specified div through Javascript.

For example, suppose we are on the about.html page. Now if I click the ‘contact’ link, the URL changes to /about.html#/contact.html, the hash value is grabbed via window.location.hash, a GET request is fired for that location with an XMLHttpRequest object, the response is inserted into the div via innerHTML, and using window.history.pushState(), we update the URL to /contact.html, to ease sharing. All this, in the blink of an eye. Cool.

I could do it in two different ways. One was to just place the text in those files and insert it into the template. The drawback of this was that if a user directly visited such a page, he/she would just see plain text. For example, check this link out. The Javascript code for that was roughly as follows.
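A sketch of that approach (the 'content' div id is an assumption, not necessarily what the template used):

// Hash-based navigation: grab the hash, fetch the fragment, swap it into the page
function getHash(callback) {
  // Defer until the browser has updated location.hash,
  // since a click handler fires before the hash actually changes
  setTimeout(function () {
    callback(window.location.hash.slice(1)); // drop the leading '#'
  }, 0);
}

function loadPage(path) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', path, true);
  xhr.onload = function () {
    if (xhr.status === 200) {
      document.getElementById('content').innerHTML = xhr.responseText;
      window.history.pushState(null, '', path); // show a clean URL
    }
  };
  xhr.send();
}

// Internal links are rewritten as href="#/page.html"
var links = document.querySelectorAll('a[href^="#/"]');
for (var i = 0; i < links.length; i++) {
  links[i].onclick = function () {
    getHash(loadPage);
  };
}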

The separate callback function for getHash was necessary because the onclick event fires before the hash is changed, which would result in the script grabbing the previous hash. As already mentioned, the navigation is smooth if one visits index.html and then the other links, but try getting the about.html page first. All you’ll see are a few lines of text.

A sloppy solution to the above problem was to replicate the entire index page with varying content for each of the pages, and then replace the entire DOM on anchor click. You can imagine this is not at all smart, and it would defeat the point of this site. Not to mention the extra bandwidth to reload the same resources on every request. Here is the implementation for this method.
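In spirit, something like this (again a sketch rather than the exact code):

// Fetch the full page for the new location and swap out the whole DOM
function loadFullPage(path) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', path, true);
  xhr.onload = function () {
    if (xhr.status === 200) {
      // Every page is a complete HTML document, so replace everything
      document.documentElement.innerHTML = xhr.responseText;
      window.history.pushState(null, '', path);
    }
  };
  xhr.send();
}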

What would be the smarter way? Perhaps making sure the first page load always downloads the entire HTML, and subsequent requests download only the changed content. I will be implementing this solution shortly. Drop in your ideas too, and share if you enjoyed reading this.

Edit

So, the above code seemed to work on my localhost, but there were some serious issues with it (cross-origin and some logical ones), so much so that I considered learning relative and absolute links all over again. Anyway, I also happened to redesign my blog during this time (redesign in the sense of changing the template), and I am working on implementing AJAX loading for all links here on my blog. So far, all links on the home page seem to work as expected (AJAXified). I am using the document.write() method, so judge me on that. You can check them out here. The code snippet follows.
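The gist of it, as a sketch (using document.write the way described above; the click interception details are assumptions on my part):

// Intercept clicks on internal links, fetch the target page, rewrite the document
document.addEventListener('click', function (e) {
  var link = e.target;
  if (link.tagName !== 'A' || link.host !== window.location.host) {
    return; // ignore non-links and external links
  }
  e.preventDefault();

  var xhr = new XMLHttpRequest();
  xhr.open('GET', link.href, true);
  xhr.onload = function () {
    if (xhr.status === 200) {
      window.history.pushState(null, '', link.href);
      document.open();
      document.write(xhr.responseText); // replace the whole document in place
      document.close();
      // The new document must include this script again for further clicks to work
    }
  };
  xhr.send();
});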

What do you think about it? It feels a little buggy at times, and sadly, Internet Explorer isn’t supported yet. But for the most part, Chrome and Firefox are working like a charm. I hope to get all links covered by this thing, which would be nice. Sadly though, the browser tab shows loading for all the third party resources on the page, which kills the purpose of this thing. Anyway, feel free to correct or improve this in the comments below.

Exchanging Chikoos For Friends

Traveling to college every day, in a train compartment, sitting in the same seat for about an hour, your entire attention is on your phone, of course. Yes, it was surely enjoyable in the early days, but after a while, you don’t get to see anything unusual most of the time. That is the reason why, if you ever spot me in a train compartment, I’ll be drowned in some book or my mobile phone.

It was a similar day today, 9 am in the morning. I was sitting by the window, enjoying the newest mixtape I had downloaded. I was into my phone screen when I noticed a ‘chikoo’ seller making his way through the bogie. I glanced at him through my peripheral vision and was soon back on the phone screen. I noticed the man sitting on the other side bought some 8 chikoos for Rs. 10. ‘Nice deal’, I thought, ‘but cannot carry them to college’. Just then, the man who bought those chikoos started distributing them among us, the people sitting next to him. I stared at him for a second, this time a bit carefully. He was a well groomed individual, around his late 40s. He was the typical man you find in countryside India, and his preference for a white outfit depicted that well.

He offered me a chikoo, and I took it without any hesitation. It was really sweet. He gave away 7 of the 8 he had to the people around him, relished the one he had left in his hand, and started talking to the people, total strangers. He said that he didn’t have a native place, and that makes him feel every place is his own. He chuckled. The people around him started with their stories, and it was fun to hear. There were some five to six of them chatting like they were long lost friends, when in fact, they had just met a few minutes ago. The smiles and the laughter I witnessed were something unforgettable. They spoke for some time after that, while I, finding no room to add anything to those elderly talks, decided to put the music back on.

It did, however, teach me how important it is to give, and how much joy it brings to the people who receive. I learned that I need not have a bucket full of cash to bring a smile to someone’s face. It can be done with some 10 bucks too. Plus, what you get are some friends who have a great first impression of you in their minds. It is a small price to pay for so many returns.

It is funny how a random incident changes your point of view to that extent. People like these exist in our world, who call the world their home, and the people their very own. The world and Mother Nature really are our very own. It is just that we often forget it in our fast paced lives. Have a great time ahead.

Private Local Cloud Storage Using Raspberrypi – How To

Today we’ll see how you can home-brew a cloud storage solution (in the local sense) that is free to use, quite a bit faster than an Internet based one, secure enough from any intrusion from outside the network, and customizable.

But why do you need to go to such lengths when you can easily create an account on Google and get 15 gigs of free storage? Well, first of all, the data we generate is increasing significantly each day. We have multiple devices with us, most of them with ~16-64 GB of storage, which is not at all good enough. Then, while our notebooks are getting faster with solid state drives, they are still too costly to use for all of our needs, like storing tonnes of movies and music videos, that is, if you are still left with space after cramming up your disk with camera pictures. If you opt for a premium account at Dropbox or Google Drive, it will easily cost you ~$100 a year, recurring, a cost which can get you a 2 TB WD external hard disk.

Then there is the speed issue. At least here in India, we are deprived of Internet connections faster than 2-4 Mbps. Most of the time, even less than that. Even if we considered the option of backing up to online cloud storage, the bandwidth prevents us from efficiently using what already exists for free. When using a local cloud, the bandwidth is throttled only by your equipment, and most of the time you can easily get ~40-60 Mbps, which is fine.

The last issue, depending on how you see it, is the most and the least important. Security. If the files are going to be random movies and music videos, you might not be much worried about some hacker breaking into your cloud storage provider and downloading them, but on the other hand, if the files contain any kind of sensitive personally identifiable information, then you would worry. Having said that, I would always choose a secure storage solution over an insecure one if given the option, even if the data were not at all sensitive.

Things you’ll need

Now that we’ve discussed some merits and demerits, let’s talk about building the thing. The things you’ll need are:

  • Raspberry Pi (with all its setup accessories), with an Ethernet port
  • Hard disk, any capacity, with SATA to USB converter
  • Wireless router
  • Ethernet cable or Wifi adapter
  • USB power hub [in some cases]

Setting up the hardware

  1. Connect the hard disk to the Raspberry Pi
  2. Boot it up and log in via SSH
  3. Run sudo fdisk -l and make sure the hard disk is shown. Note the device name (/dev/sdb or similar)
  4. If not, try a USB power hub
  5. If it is showing, we’ll have to make sure it mounts to the same location each time we boot up.
  6. Create a folder for the mount point. I’ll be using
    /var/www
  7. It would be advisable to use a separate low privileged user for the process, since we will be changing the user home later on.
  8. sudo chmod 775 /var/www

    and

    sudo chown your_username /var/www

    to set the permissions for reading, writing and executing.

  9. sudo blkid

    and note the UUID of the external hard disk. Copy it.

  10. Now we need to make the mounting occur each time we boot the pi up. Open the fstab file by
    sudo nano /etc/fstab

    and add the following line

    UUID="3b28d90f-8805-4ec4-978d-c53ee397a924" /var/www ext4 defaults,errors=remount-ro 0 1

    by editing the UUID, mount location and file system and keeping other things constant.

  11. Reboot the pi, and your /var/www should now be pointing to the external hard disk. If so, you are done with this part of the tutorial. If not, check what you missed. Also make sure you are able to read and write files to that directory from your user account. If not, recheck the steps, Google for solutions or comment for help.

Setting up the FTP server

  1. sudo apt-get install vsftpd

    to install the vsFTP server.

  2. Open the vsftp configuration file by
    sudo nano /etc/vsftpd.conf
  3. There will be a lot of options. Just go through and make sure the following lines are there and not commented out. If not, add them.

    	anonymous_enable=NO
    	local_enable=YES
    	write_enable=YES
    	chroot_local_user=YES
    	force_dot_files=YES
    	local_root=/var/www
    	allow_writable_chroot=YES
    
  4. After saving (Ctrl + x and then y) and exiting, restart vsftpd by
    sudo service vsftpd restart
  5. Lastly, change the user home to the FTP root, so that you’ll directly get into the FTP server’s root on logging into the FTP client.
    sudo usermod --home /var/www/ your_username

If all went well, we have a 100% working local cloud storage running off our pi. Now, since not everyone would like to log in with a terminal each time they wish to access the cloud, I made some customizations to make it easy for even my Mom and Dad to use the cloud.

On the desktop, download and install filezilla.

sudo apt-get install filezilla

should do it on deb derivatives. Create a launcher icon that triggers the command

filezilla sftp://myUsername:myPassword@myIP:myPort/my/root

which in my case became

filezilla sftp://abhishek:password@192.168.1.10:22/var/www



On the mobile phones (we have droids, the three of us), I used the ‘add FTP server’ option in ES File Explorer and created a shortcut on the home screen with the widgets menu. Hence, accessing the cloud is no more troublesome than accessing a local folder on the phone.



Now I have my very own, secure, high speed cloud storage solution for all my devices and also for the family. It is really convenient, and after building a custom case for the thing, it looks pretty badass.

What do you think?

Fastboot Horror

This was the second day entirely wasted trying to get my external hard drive to work with the USB 3 ports on my laptop. It just refused to get detected. It worked fine on the USB 2 port, but just didn’t read on the USB 3 ones. Initially I thought it was a Thunar issue on XFCE, but there simply wasn’t any drive in the output of fdisk -l. I read dmesg multiple times, and this line was there consistently:

[sdb] Synchronize Cache(10) failed: Result: hostbyte=DID_ERROR driverbyte=DRIVER_OK

Googled it and read every thread on the first page of the search results, literally. I had started to doubt the USB 3 ports on the notebook by now. I am on Debian testing, and thought something had broken down at the kernel level. I immediately started downloading openSUSE, to see if it really was a kernel bug, because I am not brave enough to switch kernels. Anyway, I thought, let me check if the BIOS is reading the drive, and boy, what do I see? The BIOS is just not recognizing the drive. Now I began to panic. It really looked like a hardware issue.

In between my googling, I came across a page that provided some information. Some good guy had asked the same question for his Windows 8.1 laptop. There was an accepted answer, with these simple steps: go to the BIOS, find Fastboot, disable it. Aha! I said. How did I not think of that myself? Did it, and it was working again, like it should have been. Fastboot does save a second each time I turn on my notebook, but this time, it cost me 2 full days. Lesson learnt. When the guys at elite forums say Fastboot will prevent some hardware from being read and tested on boot, they aren’t just putting a nominal warning on the door; that thing is real. “Want to make your PC boot faster? Enable fastboot”. No thanks.

The Midnight Symphony

It’s 4:07 am on the 2nd of January by my watch, and I have started to feel a little sleepy, to be honest. I have been awake this long once before, and coincidentally, it was on the same platform, in the same conditions, but for a different reason. This time, it is my train back home that got late. A bit too late.

I am not a night owl, just so you know. Staying awake beyond 1 am is a real deal for me, unlike most of my friends and maybe you. “Hey, but all computer people work at night. That’s the rule.” Well, sorry. I am that black sheep. It’s been around 7 hours since I arrived at Karwar platform. I had my train at 1:28 am, which was already too late in the first place. I hardly thought I would be able to make it till 1:30, but then, here I am.

You know something amazing? The Earth rotates. And just because of that, you can sit at a place facing east and have a large portion of the night sky pass slowly right in front of you. Super cool it feels, to watch Venus rise, followed by the half Moon and then red Mars, all surrounded by hundreds of stars, twinkling, right in front of you. The station is far from any junction, and the silence of the night is occasionally disturbed by a passing express train, whose whistle you can hear from when it is still kilometers away. Can you imagine the silence? The cold?

I am hardly able to type with my fingers. It is really very cold. My CPUs are at 24 and 26 degrees Celsius respectively, which pretty much sums up the surrounding temperature. I am wearing my napkin as a mask on my face with my hoodie on my head, to prevent my ears and nose from getting so cold that I fear touching them, seriously. In fact, when I wore the napkin on my face, my nose started to kinda defrost. Shit. That is bad. The soft breeze that comes in once in a while adds to the bone chilling cold. I had tea at 2, and the vapors that came out of the cup felt nice on my cheeks. Warm and really good.

I sometimes wonder, does every stomach make dinosaur sounds when hungry? Mine does, and to avoid the public embarrassment, I grabbed some chips from a nearby 24/7 food stall on one side of the platform. Until 12, I was constantly checking my phone for the live status of the train, covering myself with a sheet of cloth my mom caringly gave me, trying to accommodate my back on the little wooden bench on the platform, but then I thought I would use this rare moment on the platform to make a little note.

The Garib Rath Express is just leaving the platform. I am a bit attracted to these fully air-conditioned express trains. I want to travel in one too. Look at how magnificent this one looks, damn! There are some 50 passengers with us on the platform, but what is really worth watching is the night station staff. They are functioning quietly, like an army of ants working without anybody noticing. I just came to know some of their everyday post-midnight tasks. My respect rose several fold. While we sleep peacefully in the middle of the night inside the cozy compartment of our coach in the express train, these are the people who make sure the sleep isn’t interrupted, and boy, they are doing it really well.

I am glad I had Misty here with me this time. Last time on this platform, I didn’t have her. I might have experienced an amazing night, but I don’t have any record of that 3 year old event. The trees in front of me, some 50 yards away, are so quiet you can actually see their eyes closed. It is a dense forest on the other side, with nothing for kilometers except these trees.

The clock on the platform is the kind the railways used around a decade ago, those black and white analog ones. When I was a kid, I always asked Mom why the platform clock didn’t have a seconds hand. I generally don’t ask such questions these days, but I get reminded of them whenever I see those pretty objects. The amazing thing about night is that people become very easy to talk to. Everyone here is going through the same mindset, and exploiting that common linkage to explore the other uncommon corners of human nature is interesting. The station master here is friendly too. He lets me look at his console, which has little LED indicator lights that show incoming and outgoing traffic. His alarm bell rings when any train is incoming, and he clears the route for it on his console. Pretty neat. His dot matrix printer is making some sound, similar to the one we have in our college labs, but this one here sounds sweeter. Maybe it’s just me. The master also flashes green lights to trains, indicating everything is alright. It feels weird to see this human interaction, in spite of the rest of the stuff being computerized. The smile that the guard of the passing train gives is yet another treat. Another train is arriving. Can you hear the tracks making that typical squeaking sound? I am yet to find out what that actually is. But that’s the magic of it.

It’s 4:52 and I am still writing. There are so many things that demand a mention. The Railway Police guard standing confidently beside the still train is thinking about something. I wonder what that might be. The best thing about a railway platform is that it feels like a mini democracy. There are people from every section of society, eating the same sandwich and drinking the same masala tea.

I think it is time for my second cup of tea as well. Thank you for being with me guys, have a great weekend. See you then.

Yet Another Happy Year Passed

Howdy folks, first of all, wishing you a very happy new year. It has been a long year, in a purely poetic sense. Life went through some major ups and minor downs, but all in all, it was a great year. I will be summarizing a few of the great moments that I remember in the following paragraphs.

The year started with a lot of confusion. I was going great with Python, but something felt missing. I missed C. I knew C was not the right thing for me, and C++ looked intimidating. I needed something that would back me well in the code competitions I participated in, so C++ or Java seemed to be the only way out. I chose C++, for it looked similar to things I already knew.

C++ was a great choice. I got going with it quite smoothly. I learnt the fairly advanced concepts in it, and implemented some algorithms and applications using it. But the results of the coding events I participated in didn’t change much. I really sucked at them, sometimes unable to solve even 1 of the 8 problems given to be solved in the 8 to 24 hour time limit. I felt ashamed.

During March, we had our college technical festival, a.k.a. Technitude. All of a sudden, my mate Aditya and I found ourselves handling two of the technical events, namely ‘Code Wars’ and ‘Debugging’. The events were just what they sounded like, and they went fairly well. I had never before been the person responsible for any such event, and executing them successfully was something I was proud of. I even made great contacts with my seniors at the same time, who later became the people who guided me through many important decisions.

Technitude was great, and tiresome. The committee members and we, the volunteers, worked hard to make it work. Things concluded with the afterparty, which, sadly, I wasn’t a part of. Anyway, things went pretty fast after Technitude. The fourth semester came and went in no time. I cleared it, marginally. C++ was still on, in the background. I bought ‘The C++ Programming Language’ by Bjarne Stroustrup and started to read it.

In the post-semester holidays, there were a couple of other competitions, which I failed, yet again. The C++ learning wasn’t helping me with the competitions as much as I had expected. Probably because algorithms were what I lacked, not a language.

Around June, the startup fever took over. More than a startup, it was about refreshing my Python skills, learning a web framework called Django, and creating my first serious RESTful application. It took about a month for me to realize that a social networking backend was not really a one man business, at least not if I wanted to build it up in under a year. I deserted that ship and started another application along with my friend, this time entirely in Javascript. That was when the love for Javascript took over. I realized that after all the languages I had learnt, Javascript was something different. It required me to completely empty my brain of existing rules and guidelines, and start afresh. My love for Javascript will take up an entirely new article, so I am leaving it here.

Around late August or September, we had our committee elections. I was elected the Technical Head, and then our committee went on to deliver an amazing Engineers’ Day. Javascript and my new startup were in the making. Later, we got Rajashree on our team, who looked after the frontend and UI stuff. We got Nikhil, who joined the frontend team. Harshal, a budding lawyer, joined us later in November. We are five now, and while we don’t work as consistently these days, we still call ourselves the Cherrylogs Dev Team, which is nice.

In the first week of October, I got the chance to develop the website for FabIndia’s Cooltobeindian initiative. Although Rajashree and I did all that we could, they were not satisfied, and we did not get paid. Anyway, I will leave that for a separate article as well. That incident taught me a lot of things, especially to say ‘No’. Now, I am much more comfortable saying no to people and things that I know are not going to be worth the effort. The result? I get much more time for things that actually make me happy, the ones that are worth it, the 20% that matters.

Then there were submissions, practicals and examinations. I got a laptop in late October, which was generously given to me by Vijutai, my aunt, and my mom. The post-semester holidays were spent on Javascript and Cherrylogs. Kunal and I did some work, but not a lot of it. On 23rd December, I left for Murdeshwar, Karnataka. An awesome place to spend a day or two. Then I came back to my native place, Karwar, and now I am here writing this post, sitting in the hall, thinking about the wonderful year that has just passed.

Wish you all a very happy new year, yet again.