Sunday, October 20, 2024

New Book - Building Scalable Web Apps with Node.js and Express

 

🚀 I’m excited to announce the launch of my new book, Building Scalable Web Apps with Node.js and Express, now available on Amazon India!

https://m.media-amazon.com/images/S/aplus-media-library-service-media/13a77388-6cad-4282-bfaf-9a141720c45c.__CR0,0,970,600_PT0_SX970_V1___.jpg

Co-authored with Yamini Panchal, this book is designed for developers, architects, and tech enthusiasts who want to master the craft of building fast, efficient, and scalable web applications using Node.js and Express.

📘 What You'll Learn:

  • Architectural Best Practices: Dive deep into the design of scalable systems with real-world examples and practical insights.
  • Efficient Coding with Express: Discover how to leverage Express for creating robust, maintainable applications.
  • Advanced Topics: Covering everything from load balancing and distributed systems to caching and microservices.
  • Hands-on Projects: Build practical, hands-on projects that reinforce each concept as you progress through the book.

Whether you’re just starting out or looking to advance your skills, this book offers something for everyone on the journey to mastering scalable web development.

🎯 Grab your copy today and start building scalable web apps with ease!

👉 Order Now

#Nodejs #Express #WebDevelopment #ScalableApps #TechBooks

Wednesday, September 21, 2016

[Liferay] Multiple Service Builders in one Liferay Project

While working on Liferay projects and portlets, we sometimes have a huge number of service entities. We can logically group them, divide them into multiple files, and then import those into the main service.xml file.

Let's take the example of a company intranet where several modules are to be designed to support employee information, leaves, timesheets, projects, performance evaluations, trainings, meeting room booking, and so on. Our main service.xml will look like this:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE service-builder PUBLIC "-//Liferay//DTD Service Builder 6.2.0//EN" "http://www.liferay.com/dtd/liferay-service-builder_6_2_0.dtd">
<service-builder package-path="me.rkg.apps.hrms">
    <author>Ravi Kumar Gupta</author>
    <namespace>HRMS</namespace>
    <!-- Entities start here -->
    ...
    <!-- Entities end here -->
    <service-builder-import file="lms-service.xml"></service-builder-import>
    <service-builder-import file="eis-service.xml"></service-builder-import>
    <service-builder-import file="pms-service.xml"></service-builder-import>
</service-builder>
This service.xml file can include any number of entities, added just as we usually do. For the specific modules, we have imported child xxx-service.xml files.

The file names need not follow the xxx-service.xml pattern; they can be anything that suits you.

The child XML files contain the entities related to a specific module. For example, the leave management system service, lms-service.xml, will look like this:

<?xml version="1.0" encoding="UTF-8"?>
<service-builder package-path="me.rkg.apps.hrms">
    <entity name="LeaveEntry" remote-service="false" local-service="true">
        <column name="entryId" type="long" primary="true"></column>
        ...
    </entity>
    <entity name="LeaveTypes" remote-service="false" local-service="true">
        <column name="id" type="long" primary="true"></column>
        ...
    </entity>
</service-builder>
Our performance management system service, pms-service.xml, will look like this:

<?xml version="1.0" encoding="UTF-8"?>
<service-builder package-path="me.rkg.apps.hrms">
    <entity name="EvaluationParam" remote-service="false" local-service="true">
        <column name="paramId" type="long" primary="true"></column>
        ...
    </entity>
    <entity name="EvaluationCycle" remote-service="false" local-service="true">
        <column name="cycleId" type="long" primary="true"></column>
        ...
    </entity>
    ...
</service-builder>
Hope this helps you split your long list of entities into smaller XML files and leaves you with a short, readable, and organized service.xml. :)

Sunday, January 24, 2016

[ELK] An introduction to ELK Stack - Elasticsearch, Logstash, Kibana


... I had more than five servers continuously running on my computer when I was in college. My interest in PHP led me to install Apache, MySQL, and a mail server so that I could host multiple sites from my machine. On top of these there were FTP, VNC, and so on. Whenever there was a problem with a server, I would go and check that server's logs. Even up to two years ago, I did the same for other servers; by then the list also included app servers for Liferay.

I used to check logs for each server individually, until I learned about Splunk, a tool my client was using to see all of the logs in one place. It provided far more than just a view: I could search logs within a defined time range, and so on. It was collecting logs not from just one machine but from almost 20, and on a few machines, from multiple servers. Troubleshooting was easy this way. I was not logging in to 20 different machines to check logs and look for possible problems; honestly, I had no access to most of them. But I had access to Splunk, and I could query the already indexed logs and diagnose which server had problems.

But Splunk, with its full feature set, is not free.

Then what are the other options? Let me give you the hint I got when I searched.

Google (GOD) gave a hint: look at ELK, once!

Out of curiosity, I jumped straight to learning the ELK Stack rather than checking any other search results. Instantly; after all, Google said once. :P

ELK = Elasticsearch, Logstash, Kibana.

In a very, very simplified view, the stack works like this: Logstash collects and ships the data, Elasticsearch indexes it, and Kibana visualizes it.

Elasticsearch - Indexes the data sent to it. The core of Elasticsearch is Lucene.

Logstash - It's a data pipeline that can read data from a number of sources. More than 200 plugins are available for Logstash, classified into four categories: input, output, filter, and codec plugins.

Kibana - Whatever Elasticsearch has indexed, Kibana lets you visualize that data in different forms. The data can be queried, listed, or drawn as charts.
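To get a feel for what "the data can be queried" means, here is a small sketch of the kind of time-range query, in Elasticsearch's query DSL, that a tool like Kibana issues behind the scenes. The function name and its parameters are my own illustration, not any library's API:

```javascript
// Build an Elasticsearch query-DSL body for a time-range search.
// buildTimeRangeQuery is an illustrative helper, not part of any API.
function buildTimeRangeQuery(field, from, to) {
  return {
    query: {
      range: {
        [field]: { gte: from, lte: to }
      }
    }
  };
}

// Example: query the last day of events on the @timestamp field
// that Logstash adds to every event it ships.
const body = buildTimeRangeQuery('@timestamp', 'now-1d', 'now');
console.log(JSON.stringify(body, null, 2));
```

A body like this can be POSTed to an index's _search endpoint, e.g. http://localhost:9200/logstash-*/_search.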

This was just a very basic introduction; these components offer a lot more. We will learn more about them in future posts.

Until next time :)



Saturday, January 23, 2016

[ELK] Reading, Indexing and Visualizing Windows logs with ELK Stack

Hello Friends,

A system administrator knows how much system logs can help in troubleshooting critical problems. And what if those logs were indexed in one place and could be visualized in charts? Fun, isn't it?

Let's see what we need to set this up..

1. Nothing special needs to be done for Elasticsearch; just run it by executing elasticsearch.bat from the bin directory.

D:\Prep\ELK\elasticsearch-2.1.1>bin\elasticsearch.bat
[2016-01-23 15:06:38,555][WARN ][bootstrap ] unable to install syscall filter: syscall filtering not supported for OS: 'Windows 7'
[2016-01-23 15:06:39,281][INFO ][node ] [Wild Child] version[2.1.1], pid[17364], build[40e2c53/2015-12-15T13:05:55Z]
[2016-01-23 15:06:39,282][INFO ][node ] [Wild Child] initializing ...
[2016-01-23 15:06:39,592][INFO ][plugins ] [Wild Child] loaded [], sites [kopf]
[2016-01-23 15:06:39,641][INFO ][env ] [Wild Child] using [1] data paths, mounts [[New Volume (D:)]], net usable_space [71.4gb], net total_space [270.4gb]
, spins? [unknown], types [NTFS]
[2016-01-23 15:06:43,299][INFO ][node ] [Wild Child] initialized
[2016-01-23 15:06:43,299][INFO ][node ] [Wild Child] starting ...
[2016-01-23 15:06:44,750][INFO ][transport ] [Wild Child] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::1]:9300}
[2016-01-23 15:06:44,761][INFO ][discovery ] [Wild Child] elasticsearch/NU9kgl8jQcuPJeiDLj4tMg
[2016-01-23 15:06:48,840][INFO ][cluster.service ] [Wild Child] new_master {Wild Child}{NU9kgl8jQcuPJeiDLj4tMg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-join(ele
cted_as_master, [0] joins received)
[2016-01-23 15:06:49,072][INFO ][gateway ] [Wild Child] recovered [0] indices into cluster_state
[2016-01-23 15:06:51,750][INFO ][http ] [Wild Child] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::1]:9200}
[2016-01-23 15:06:51,751][INFO ][node ] [Wild Child] started
2. Prepare a config so that Logstash can read Windows logs. We need to add the config to the input plugin.

input {
  eventlog {
    type => 'Win32-EventLog'
    logfile => 'System'
  }
}
The eventlog plugin lets Logstash read Windows event logs, which are stored in a binary format and can only be accessed through the Win32 API. The plugin takes several configuration options, all of them optional: codec, add_field, logfile, interval, tags, and type. We are using only logfile and type.

a) type has no default value and is used for filter activation. The given type is stored as part of the event, and we can search for such events in Kibana using it.
b) logfile is an array of strings, with Application, Security, and System as the default values. In our config we have used only System.
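As an illustration, a variant that reads both the Application and System logs and tags the events might look like this. The option values here are assumptions built from the option list above, not tested settings:

```
input {
  eventlog {
    type    => 'Win32-EventLog'
    logfile => ['Application', 'System']
    tags    => ['windows']
  }
}
```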

3. Store this config in a file named logstash-windowslogs.conf in Logstash's conf directory, and run Logstash using:

>bin\logstash.bat -f conf\logstash-windowslogs.conf
io/console not supported; tty will not be manipulated
Settings: Default filter workers: 2
Logstash startup completed
where the complete config file looks like this:

input {
  eventlog {
    type => 'Win32-EventLog'
    logfile => 'System'
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
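While testing a pipeline like this, it can also help to print events to the console alongside indexing them; Logstash's standard stdout output plugin with the rubydebug codec does that. Adding it is my suggestion, not part of the original setup:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
```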
4. We're done with the Logstash configuration. Next is to read this data in Kibana. Here is our config (in kibana.yml) that connects Kibana to Elasticsearch so it can read the indices:

# Kibana is served by a back end server. This controls which port to use.
server.port: 5601
# The host to bind the server to.
server.host: "localhost"
# If you are running kibana behind a proxy, and want to mount it at a path,
# specify that path here. The basePath can't end in a slash.
# server.basePath: ""
# The Elasticsearch instance to use for all your queries.
elasticsearch.url: "http://localhost:9200"

5. As soon as we start Logstash, it starts collecting all the events and indexing them into Elasticsearch. Now let's run Kibana by executing kibana.bat in the bin directory.

>bin\kibana.bat
log [15:02:40.715] [info][status][plugin:kibana] Status changed from uninitialized to green - Ready
log [15:02:40.762] [info][status][plugin:elasticsearch] Status changed from uninitialized to green - Ready
log [15:02:40.784] [info][status][plugin:kbn_vislib_vis_types] Status changed from uninitialized to green - Ready
log [15:02:40.794] [info][status][plugin:markdown_vis] Status changed from uninitialized to green - Ready
log [15:02:40.802] [info][status][plugin:metric_vis] Status changed from uninitialized to green - Ready
log [15:02:40.808] [info][status][plugin:spyModes] Status changed from uninitialized to green - Ready
log [15:02:40.825] [info][status][plugin:statusPage] Status changed from uninitialized to green - Ready
log [15:02:40.835] [info][status][plugin:table_vis] Status changed from uninitialized to green - Ready
log [15:02:40.867] [info][listening] Server running at http://localhost:5601
6. Once Kibana has started, we can open its interface in a browser at http://localhost:5601/. We need to configure an index pattern. By default, it shows logstash-* in the "Index name or pattern" field and @timestamp in the "Time-field name" field. Hit Create, keeping the defaults.

7. Now click Discover in the top menu to see the results. It will show a histogram of event counts.

All the results are also listed below the histogram, but for security reasons I have not shown them here..

That's it. Pretty clean, wouldn't you say?


Thursday, July 30, 2015

[Forms] Detect enter key pressed using javascript/JQuery

Hi,

Sometimes we need to detect when the Enter key, or some other key, was pressed. I had a use case with a search box that was not part of a form: the user could type something into the search input text box, and when they hit Enter, the page should fetch data using an AJAX call. The code below detects when the Enter key was pressed.

We will add a keypress handler to the document, which captures an event whenever a key is pressed. We then need to check whether the key was Enter; its key code is 13.

$(document).keypress(function(e) {
    if (e.which == 13) {
        // Enter was pressed; do operations here.
        if ($('#searchText:focus').length > 0) {
            // ajax call to get results.
        }
    }
});

We also need to check that the focus was actually on the search input text box, which is what the :focus selector does.
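If you prefer not to depend on jQuery, the same idea can be sketched in plain JavaScript. Modern browsers expose event.key, with the legacy event.which kept as a fallback; the isEnterKey helper name is my own:

```javascript
// Returns true when a keyboard event represents the Enter key.
// 'key' is the modern property; 'which' is the legacy key code.
function isEnterKey(e) {
  return e.key === 'Enter' || e.which === 13;
}

// Attach the listener only in a browser; 'document' is undefined elsewhere.
if (typeof document !== 'undefined') {
  document.addEventListener('keypress', function (e) {
    // Also make sure the search box actually has focus.
    if (isEnterKey(e) && document.activeElement === document.getElementById('searchText')) {
      // ajax call to get results.
    }
  });
}
```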

That's all we need.

Until next time :)

Friday, July 3, 2015

[Apache] Creating virtual hosts with Apache Httpd web server.

It's common to see multiple projects running on one machine, and it looks great when every web application runs under its own domain name. I usually keep my hostname as rkg.test and use xxxx.proj for projects. If I have multiple projects, say a blog and a forum, I can name them blog.proj, forum.proj, and so on. Neat, isn't it?

Here are simple steps to make it work.
1. Open your hosts file (on Windows, C:\Windows\System32\drivers\etc\hosts) and add your domains at the end of the file, like this:
127.0.0.1 blog.proj
127.0.0.1 forum.proj

2. Make sure httpd.conf has virtual hosting enabled. You can either add the virtual hosts in httpd.conf itself or put them in a separate file and include that. Let's keep them in a separate file and add this line to httpd.conf:
Include conf/extra/httpd-vhosts.conf

3. If not already present, create a folder named extra inside conf/ and create an httpd-vhosts.conf file there.

4. Add a line for enabling NameVirtualHost
NameVirtualHost *:80

5. Add a virtual host by adding these lines to the httpd-vhosts.conf file:
<VirtualHost *:80>
    ServerAdmin webmaster@blog.proj
    DocumentRoot "D:/pFiles/xampp/htdocs/blog.proj"
    ServerName blog.proj
    ErrorLog "logs/blog.proj-error.log"
    CustomLog "logs/blog.proj.log" common
</VirtualHost>

6. Similarly add one more for forum.proj
<VirtualHost *:80>
    ServerAdmin webmaster@forum.proj
    DocumentRoot "D:/pFiles/xampp/htdocs/forum.proj"
    ServerName forum.proj
    ErrorLog "logs/forum.proj-error.log"
    CustomLog "logs/forum.proj.log" common
</VirtualHost>

7. Restart Apache, and then open blog.proj and forum.proj in your browser.
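A small note if you are on Apache 2.4 or newer: the NameVirtualHost directive is deprecated there (name-based virtual hosting happens automatically), and a DocumentRoot outside the default tree usually needs an explicit access grant. A sketch of such a grant, reusing the example paths from above:

```
<VirtualHost *:80>
    ServerName blog.proj
    DocumentRoot "D:/pFiles/xampp/htdocs/blog.proj"
    <Directory "D:/pFiles/xampp/htdocs/blog.proj">
        Require all granted
    </Directory>
</VirtualHost>
```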

Simple enough. Isn't it?

Until next time, C ya. Enjoy.