Sunday, October 20, 2024

New Book - Building Scalable Web Apps with Node.js and Express

 

🚀 I’m excited to announce the launch of my new book, Building Scalable Web Apps with Node.js and Express, now available on Amazon India!

https://m.media-amazon.com/images/S/aplus-media-library-service-media/13a77388-6cad-4282-bfaf-9a141720c45c.__CR0,0,970,600_PT0_SX970_V1___.jpg

Co-authored with Yamini Panchal, this book is designed for developers, architects, and tech enthusiasts who want to master the craft of building fast, efficient, and scalable web applications using Node.js and Express.

📘 What You'll Learn:

  • Architectural Best Practices: Dive deep into the design of scalable systems with real-world examples and practical insights.
  • Efficient Coding with Express: Discover how to leverage Express for creating robust, maintainable applications.
  • Advanced Topics: Covering everything from load balancing and distributed systems to caching and microservices.
  • Hands-on Projects: Build practical, hands-on projects that reinforce each concept as you progress through the book.

Whether you’re just starting out or looking to advance your skills, this book offers something for everyone on the journey to mastering scalable web development.

🎯 Grab your copy today and start building scalable web apps with ease!

👉 Order Now

#Nodejs #Express #WebDevelopment #ScalableApps #TechBooks

Wednesday, September 21, 2016

[Liferay] Multiple Service Builders in one Liferay Project

While working on Liferay projects and portlets, we sometimes have a large number of service entities. We can group them logically, divide them into multiple files, and then import those into the main service.xml file.

Let's take the example of a company intranet where several modules are to be designed: employee information, leaves, timesheets, projects, performance evaluations, trainings, meeting room booking, and so on. Our main service.xml will look like this:
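The original listing did not survive here, so below is a minimal sketch of the main service.xml, assuming Liferay 6.2's Service Builder DTD; the package path, namespace, and entity names are illustrative. The trick is to declare the child files as external XML entities in the DOCTYPE and reference them inside the root element:

```xml
<?xml version="1.0"?>
<!DOCTYPE service-builder PUBLIC "-//Liferay//DTD Service Builder 6.2.0//EN"
    "http://www.liferay.com/dtd/liferay-service-builder_6_2_0.dtd" [
    <!-- Declare each child file as an external entity -->
    <!ENTITY lms SYSTEM "lms-service.xml">
    <!ENTITY pms SYSTEM "pms-service.xml">
]>
<service-builder package-path="com.example.intranet">
    <namespace>Intranet</namespace>

    <!-- Common entities can live directly in the main file -->
    <entity name="Employee" local-service="true" remote-service="false">
        <column name="employeeId" type="long" primary="true" />
        <column name="name" type="String" />
    </entity>

    <!-- Pull in the module-specific entities -->
    &lms;
    &pms;
</service-builder>
```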

This service.xml file can include any number of entities, just as we would add them in the usual way. For the module-specific ones, we import the child xxx-service.xml files.

The file names need not follow the xxx-service.xml pattern; they can be anything that suits you.

Each child XML file contains the entities related to a specific module. For example, the leave management system's lms-service.xml will look like this:
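Again, the original listing is missing, so here is an illustrative sketch; the entity and column names are examples. Note that the child file is a plain fragment with no XML declaration or root element, because it is spliced into the main file as an external entity:

```xml
<!-- lms-service.xml: entities for the leave management system -->
<entity name="LeaveRequest" local-service="true" remote-service="false">
    <column name="leaveRequestId" type="long" primary="true" />
    <column name="employeeId" type="long" />
    <column name="startDate" type="Date" />
    <column name="endDate" type="Date" />
    <column name="status" type="String" />
</entity>
```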

Our performance management system's pms-service.xml will look like this:
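As above, this is an assumed sketch standing in for the lost listing, with example entity and column names:

```xml
<!-- pms-service.xml: entities for the performance management system -->
<entity name="PerformanceReview" local-service="true" remote-service="false">
    <column name="reviewId" type="long" primary="true" />
    <column name="employeeId" type="long" />
    <column name="reviewerId" type="long" />
    <column name="rating" type="int" />
    <column name="comments" type="String" />
</entity>
```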

Hope this helps you split your long list of entities into smaller XML files and leaves you with a short, readable, and organized service.xml. :)

Sunday, January 24, 2016

[ELK] An introduction to ELK Stack - Elasticsearch, Logstash, Kibana


... I had more than five servers continuously running on my computer when I was in college. My interest in PHP led me to install Apache, MySQL, and a mail server so that I could host multiple sites from my machine. Apart from these, there were FTP, VNC, and others. Whenever there was a problem with a server, I would go and check that server's logs. Even until two years ago, I used to do the same for other servers; by then, the list had grown to include app servers for Liferay.

I used to check logs for each server individually, until I got to know about Splunk, a tool my client was using to see all of the logs in one place. It provided far more than just a view: I could search logs over a defined time range, and more. It was collecting logs not from just one machine but from almost 20, and for a few of those machines, from multiple servers. Troubleshooting was easy this way. I was not logging in to 20 different machines to check logs and hunt for possible problems; honestly, I had no access to most of them. But I had access to Splunk, so I could query the already-indexed logs and diagnose which server had problems.

But, Splunk (full features) is not free.

Then what are the other options? Let me give you a hint in the image below.

Google (GOD) gave a hint : Look at ELK, once!

Out of curiosity, I jumped straight into learning about the ELK Stack rather than checking any of the search results. Well, Google did say "once". :P

ELK = Elasticsearch, Logstash, Kibana.

The image below shows how this stack works.

The above is a very simplified representation of the ELK Stack.

Elasticsearch - Indexes the data sent to it. At the core of Elasticsearch is Lucene.

Logstash - It's a data pipeline that can read data from a number of sources. There are more than 200 plugins available for Logstash, classified into four categories: Input, Output, Filter, and Codec plugins.

Kibana - Whatever Elasticsearch has indexed, Kibana lets you visualize in different forms. The data can be queried, listed, or drawn as charts.
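To make the flow concrete, here is a minimal sketch of a Logstash pipeline that reads a log file, parses it, and ships it to Elasticsearch. The log path is an assumption, and the plugin names shown (file, grok, elasticsearch) are standard Logstash plugins:

```conf
input {
  file {
    path => "/var/log/apache2/access.log"   # assumed log location
    start_position => "beginning"
  }
}

filter {
  grok {
    # parse each line as a standard Apache combined-format log entry
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```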

This was just a very basic introduction to these components; they offer a lot more than this. We will get to learn more about them in future posts.

Until next time :)



Saturday, January 23, 2016

[ELK] Reading, Indexing and Visualizing Windows logs with ELK Stack

Hello Friends,

A system administrator knows how much system logs can help in troubleshooting critical problems. And what if those logs were indexed in one place and could be visualized as charts? Fun, isn't it?

Let's see what can help us to set it up..

1. Nothing special needs to be done for Elasticsearch; just run it by executing elasticsearch.bat from its bin directory.

2. Prepare a config so that Logstash can read Windows logs. We need to add the config in the input plugin section.
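The original snippet is missing here, so below is a sketch of the input section consistent with the options described next (logfile and type); the type value is an example:

```conf
input {
  eventlog {
    type    => "Win32-EventLog"   # example tag used later for searching in Kibana
    logfile => ["System"]         # read only the System event log
  }
}
```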

This eventlog plugin lets Logstash read Windows event logs. Windows logs are stored in a binary format and can be accessed only through the Win32 API. The plugin takes several configuration options, all of them optional: codec, add_field, logfile, interval, tags, and type. Of these, we are using logfile and type.

a) type has no default value and is used for filter activation. The given type is stored as part of the event, and we can search for events in Kibana using it.
b) logfile is an array of strings and defaults to Application, Security, and System. In our config, we use only System.

3. Store this config in a custom file named logstash-windowslogs.conf in Logstash's conf directory, and run Logstash using:
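The original command is missing; assuming a standard Windows Logstash install, it would be along these lines:

```shell
bin\logstash.bat -f conf\logstash-windowslogs.conf
```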

where the complete config file will be as follows:
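The full listing is also missing, so here is a sketch that combines the eventlog input above with an Elasticsearch output; the hosts syntax assumes a Logstash 2.x-era elasticsearch output plugin:

```conf
input {
  eventlog {
    type    => "Win32-EventLog"
    logfile => ["System"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```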

4. We're done with the configuration of Logstash. Next is to read this data in Kibana. Here is our config to connect Kibana to Elasticsearch to read the indices:
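The original kibana.yml excerpt did not survive; assuming a Kibana 4.x-era config, the relevant connection settings look like this:

```yaml
# kibana.yml - only the connection settings matter here
server.port: 5601
elasticsearch.url: "http://localhost:9200"
```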


5. As soon as we start Logstash, it will start collecting all the events and indexing them into Elasticsearch. Now let's run Kibana by executing kibana.bat in its bin directory.

6. Once Kibana is started, we can open its interface in a browser at http://localhost:5601/. We need to configure an index pattern. By default, it shows logstash-* in the index name or pattern field and @timestamp in the Time-field name field. Hit Create, keeping the defaults.

7. Now click Discover in the top menu to see the results. It will show a histogram of event counts.

All the results are also listed below the histogram, but for security reasons I have not shown them here.

That's it. Pretty clean, what do you say?


Thursday, July 30, 2015

[Forms] Detect enter key pressed using javascript/JQuery

Hi,

Sometimes we need to detect when the Enter key (or some other key, as per the requirement) was pressed. I had a use case with a search box that was not part of a form: the user could type some content into the search input text box, and when the user hit Enter, it should fetch data using an ajax call. The code below detects when the Enter key was pressed.

We will add a keypress handler to the document, which captures events whenever a key is pressed. We then need to check whether the pressed key is Enter; its key code is 13.


We also need to check that the focus was actually on the search input text box.
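The original snippet is missing, so here is a sketch of the handler. jQuery is assumed to be loaded, and the searchBox id and doSearch() function are hypothetical names standing in for the real ones:

```javascript
// Pure helper so the check is easy to reason about: Enter has key code 13,
// and we only act when the event came from the search box.
function isEnterOnSearch(keyCode, targetId) {
  return keyCode === 13 && targetId === 'searchBox'; // 'searchBox' is a hypothetical id
}

// Wire it up only in a browser context where jQuery and the DOM exist.
if (typeof document !== 'undefined' && typeof $ !== 'undefined') {
  $(document).keypress(function (e) {
    if (isEnterOnSearch(e.which, e.target.id)) {
      e.preventDefault();                  // stop Enter from doing anything else
      doSearch($('#searchBox').val());     // hypothetical ajax search function
    }
  });
}
```

Keeping the key check in a small pure function also makes it trivial to reuse for other keys: just compare against a different key code.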

That's all we need.

Until next time :)

Friday, July 3, 2015

[Apache] Creating virtual hosts with Apache Httpd web server.

It's common to see multiple projects running on one machine. Sometimes it's nice if every web application runs under its own domain name. I usually keep my hostname as rkg.test and, for projects, xxxx.proj. If I have multiple projects, say blog, forum, etc., I can keep them as blog.proj, forum.proj, and so on. Neat, isn't it?

Here are simple steps to make it work.
1. Open your hosts file (C:\Windows\System32\drivers\etc\hosts) and add your domains to it, at the end of the file, like this:
127.0.0.1 blog.proj
127.0.0.1 forum.proj

2. Make sure the httpd.conf file has virtual hosting enabled. You can either add the virtual hosts in httpd.conf itself or put them in a separate file and include that. Let's keep them in a separate file and add this line to httpd.conf:
Include conf/extra/httpd-vhosts.conf

3. If not already present, create a folder named extra inside conf/ and create an httpd-vhosts.conf file in it.

4. Add a line enabling name-based virtual hosting (required on Apache 2.2; Apache 2.4 ignores this directive):
NameVirtualHost *:80

5. Add a virtual host by adding these lines to the httpd-vhosts.conf file:
<VirtualHost *:80>
    ServerAdmin webmaster@blog.proj
    DocumentRoot "D:/pFiles/xampp/htdocs/blog.proj"
    ServerName blog.proj
    ErrorLog "logs/blog.proj-error.log"
    CustomLog "logs/blog.proj.log" common
</VirtualHost>

6. Similarly, add one more for forum.proj:
<VirtualHost *:80>
    ServerAdmin webmaster@forum.proj
    DocumentRoot "D:/pFiles/xampp/htdocs/forum.proj"
    ServerName forum.proj
    ErrorLog "logs/forum.proj-error.log"
    CustomLog "logs/forum.proj.log" common
</VirtualHost>

7. Restart Apache, then open blog.proj and forum.proj in your browser.

Simple enough. Isn't it?

Until next time, C ya. Enjoy.