Sony Arouje

a programmer's log

Configure MS SQL Server on Mac M1 via Docker


In this post, I will walk you through running MS SQL Server on a Mac M1. Make sure Docker Desktop is installed on your machine.

I prefer using a Docker Compose file to run containers because it helps me avoid remembering all the environment variables, ports, and configurations.
To run SQL Server, I’ve created the following Docker Compose file:

version: '3.8'

services:
  sqldata:
    image: mcr.microsoft.com/mssql/server:2022-latest
    environment:
        - SA_PASSWORD=Pass@word
        - ACCEPT_EULA=Y
    container_name: sqlserver-tracer
    ports:
        - "1433:1433"
    volumes:
        - './data:/var/opt/mssql/data'
        - './log:/var/opt/mssql/log'
        - './secrets:/var/opt/mssql/secrets'

Now, run the following command to start the SQL Server container:

docker-compose up

The container will exit with the following error:

This container is running as user mssql.
To learn more visit https://go.microsoft.com/fwlink/?linkid=2099216.
/opt/mssql/bin/sqlservr: Invalid mapping of address 0x400976b000 in reserved address space below 0x400000000000. Possible causes:
1) the process (itself, or via a wrapper) starts-up its own running environment sets the stack size limit to unlimited via syscall setrlimit(2);
2) the process (itself, or via a wrapper) adjusts its own execution domain and flag the system its legacy personality via syscall personality(2);
3) sysadmin deliberately sets the system to run on legacy VA layout mode by adjusting a sysctl knob vm.legacy_va_layout.

To fix the above error, follow these steps in Docker settings:

General -> check Use Virtualization framework
Features in development -> check Use Rosetta for x86/amd64 emulation on Apple Silicon

Now, run the Docker Compose command again, and this time SQL Server should be up and running.
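
To verify that the server is accepting connections, you can run sqlcmd from inside the container. A quick check, assuming the tools live at /opt/mssql-tools18 as in recent 2022 images (older images ship them at /opt/mssql-tools and don't need the -C flag):

docker exec -it sqlserver-tracer /opt/mssql-tools18/bin/sqlcmd \
  -S localhost -U sa -P 'Pass@word' -C -Q 'SELECT @@VERSION'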

DB Management tool

Download Azure Data Studio

Create a new connection, keeping in mind that the server parameter should be set to something like localhost,1433, where 1433 is the port mentioned in the Docker Compose file.

  • User Name: sa
  • Password: the password configured in the SA_PASSWORD environment variable of the compose file
  • Leave the database as default.
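
If you want to connect from application code instead, a typical connection string built from the same compose values looks like this (a sketch; TrustServerCertificate is needed because the container uses a self-signed certificate):

Server=localhost,1433;Database=master;User Id=sa;Password=Pass@word;TrustServerCertificate=True;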

Note: This tutorial assumes that you have Docker Desktop installed on your Mac; if you haven’t already, you can download it from the Docker website. Additionally, make sure you have a basic understanding of Docker and its concepts, as we’ll be using Docker Compose to manage our SQL Server container.

Written by Sony Arouje

September 21, 2023 at 4:56 pm

Posted in Misc, Programming


Compile and run FreeSWITCH in Raspberry pi


In recent days, I have been spending my free time learning the SIP and RTP protocols. To progress with my learning, I decided to set up FreeSWITCH. As usual, I decided to use one of my RPis and compile the system from source. Compiling from source gives me a basic understanding of the binaries and their dependencies.

The first task was to install all the dependencies. I followed this link to set up the deb repository, but I always got the error below (a probable cause and a possible fix follow the log). Since I wasn’t sure how to resolve it at the time, I skipped this step and decided to install the dependencies manually.

Hit:1 http://deb.debian.org/debian bullseye InRelease
Get:2 http://deb.debian.org/debian bullseye-updates InRelease [44.1 kB]
Hit:3 http://security.debian.org/debian-security bullseye-security InRelease
Hit:4 http://archive.raspberrypi.org/debian bullseye InRelease
Ign:5 https://freeswitch.signalwire.com/repo/deb/rpi/debian-release `lsb_release InRelease                                                                                            
Err:6 https://freeswitch.signalwire.com/repo/deb/rpi/debian-release `lsb_release Release                                                                                              
  404  Not Found [IP: 190.102.98.174 443]
Reading package lists... Done                                                                                                                                                         
E: The repository 'https://freeswitch.signalwire.com/repo/deb/rpi/debian-release `lsb_release Release' does not have a Release file.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.
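
The backtick before lsb_release in the repo URL suggests that the entry in /etc/apt/sources.list.d/freeswitch.list was written with an unexpanded `lsb_release -sc` command substitution. If you want to retry the repository route, the line should carry the actual release codename instead. This is my guess at the corrected entry; I haven't verified that the rpi repo actually publishes a bullseye release:

deb https://freeswitch.signalwire.com/repo/deb/rpi/debian-release/ bullseye main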

Install dependencies

Some googling turned up the basic dependencies needed to compile the code, but I had to install more of them during the configure and make steps. Below is the full set of dependencies I installed to compile the FreeSWITCH source and its dependent sources. If I missed any, please add them in the comments.

sudo apt-get install build-essential
sudo apt-get install git-core build-essential autoconf automake libtool libncurses5 
sudo apt-get install libncurses5-dev make libjpeg-dev pkg-config unixodbc 
sudo apt-get install unixodbc-dev zlib1g-dev libtool-bin
sudo apt-get install libcurl4-openssl-dev libexpat1-dev libssl-dev sqlite3 
sudo apt-get install libsqlite3-dev libpcre3 libpcre3-dev libspeexdsp1 
sudo apt-get install libspeexdsp-dev libldns-dev libavformat-dev ffmpeg 
sudo apt-get install libedit-dev python3.9-distutils cmake libswscale-dev 
sudo apt-get install liblua5.1-0-dev libopus-dev libpq-dev libsndfile-dev 
sudo apt-get install uuid uuid-dev

Compile source code dependencies

To compile FreeSWITCH, we first need to compile the projects below.

sofia-sip

cd /usr/src
sudo git clone https://github.com/freeswitch/sofia-sip
cd sofia-sip
sudo ./bootstrap.sh
sudo ./configure
sudo make
sudo make install

spandsp

cd /usr/src 
sudo git clone https://github.com/freeswitch/spandsp
sudo apt-get install libtiff-dev
cd spandsp
sudo ./bootstrap.sh
sudo ./configure
sudo make
sudo make install

libks

cd /usr/src
sudo git clone https://github.com/signalwire/libks
cd libks
sudo cmake .
sudo make
sudo make install

signalwire-c

cd /usr/src
sudo git clone https://github.com/signalwire/signalwire-c
cd signalwire-c
sudo cmake .
sudo make
sudo make install

Compile FreeSWITCH

Source code compilation will take some time, be patient.

cd /usr/src
sudo git clone git://git.freeswitch.org/freeswitch.git -b v1.10 freeswitch
cd freeswitch
sudo ./bootstrap.sh
sudo ./configure
sudo make 
sudo make install 
sudo make cd-sounds-install cd-moh-install

Once all the above steps are completed, you can go through the post-install steps. I only did the owner-permission step, as I intend to run FreeSWITCH from the command line and not as a daemon.
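
For reference, the owner-permission step boils down to something like this, assuming the default install prefix of /usr/local/freeswitch; since I run FreeSWITCH interactively, this hands the tree to your own user rather than a dedicated freeswitch account:

sudo chown -R $USER:$USER /usr/local/freeswitch
/usr/local/freeswitch/bin/freeswitch   # start in the foreground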

Written by Sony Arouje

February 26, 2023 at 5:18 pm

Switching from Windows to MacBook


I was an avid Windows user for over 20 years. My Windows machine is old and may give up on me any time soon, so I started looking for a powerful machine for my development needs. I had almost settled on a Dell laptop, but I started looking at MacBooks as well. In the end I decided to shift to a Mac after seeing the reviews of the M1 chip’s performance. Another reason is that I can run Visual Studio on the Mac, which is an important requirement for me; VS 2022 for Mac is still a preview release, but it should stabilize soon. I have used an iPhone for many years, so I know the basics of how Apple’s ecosystem works.

I ordered a MacBook Pro M1 Max with 64 GB. As it was custom built, I waited for almost a month, and then the seller told me delivery would take even longer than expected; they offered me a 32 GB version instead. I did some research to see whether there would be a performance impact if I downgraded to 32 GB. I came across some YouTube videos where people stress-tested both the 32 GB and 64 GB versions, and I couldn’t see much difference between the two configurations. So I decided to go ahead with 32 GB.

Getting started with mac

Initially it was difficult. I didn’t have a clue how to access the Windows Explorer equivalent on the Mac. The next hurdle was the keyboard shortcuts. I started googling to understand macOS and its shortcuts, and in the end got a fair idea of how the system works.

I got to know how Finder, Launchpad, Spotlight, etc. work; these are some of the basic macOS applications I experimented with first. The next thing to learn was how to access running applications. I had to learn more about using the trackpad, with its three-finger and two-finger gestures. I was amazed by how the trackpad works; it is very convenient and extremely responsive. I never experienced such a trackpad on any Windows-based laptop.

Setting up the system

I was able to set up my development system without much of a hurdle. I mostly use VS Code, Visual Studio, Docker, nodejs, golang, etc., and I didn’t have any trouble setting up any of the applications I use on a daily basis.

Performance of the system

There are so many benchmarks available on the internet and I am not going to repeat them, but the MacBook Pro is blazingly fast. I haven’t experienced any kind of slowdown, even when memory utilisation reached close to 28 GB. It is one of the fastest laptops I have ever used. System startup is quick, just like a smartphone. Battery life is very impressive; I haven’t tested the full duration, but I could easily work for 6-7 hours without plugging in. I have been using this Mac for a couple of days and have never heard the cooling fan.

The Retina display is one of the best; it is so crisp and clear that I can read it even from a little distance away.

Conclusion

I thoroughly enjoy the experience of using the MacBook. Its build quality is superior. I initially started with my Logitech wireless mouse but soon realised that the built-in trackpad is easier to use than a mouse. With the mouse, I don’t yet know how to quickly access running programs and the other options the trackpad provides, so I switched off the mouse and started using the trackpad. So far I am really satisfied with my MacBook.

That said, I still prefer the keyboard shortcuts on Windows to the Mac’s, for example pressing End or Home to move to the end or the start of a line, among others. I hope that with practice I will get more comfortable with the Mac keyboard.

I have been using the Mac for less than a week; I will update this blog post once I have used it for some more time.

Written by Sony Arouje

June 15, 2022 at 1:52 pm

Posted in .NET


Go plugin file size optimization


Nowadays most of my pet projects are developed in golang. Recently I started working on a project that reads data from sensors to do some automation at my farm. The program runs on a Raspberry Pi. I designed the system around a plugin architecture; this approach allows me to expand the functionality with ease. When I started adding plugins, I realized that the plugin binary size was larger than I expected. With fast internet connectivity the size doesn’t cause much harm, but when the system is deployed somewhere with a really slow connection, the file size really matters. This post is about how I managed to reduce the plugin file size.


If you are designing a pluggable system, real care should be given to the plugin code base. Design the plugins to minimize package imports: golang plugins are self-sufficient, like any other Go binary, so whatever packages you import add size to the plugin. To make things clearer, I wrote a small application that loads several plugins and prints hello world (a sketch of the loader appears after the build step below).

Lets write some code

Here is what a sample plugin looks like:

import "fmt"

type Plugin struct {
}

func (d Plugin) Write() {
	fmt.Println("Hello world")
}

var P Plugin

In the same way, I created another plugin that writes hello world in German.

Build the plugins:

go build -buildmode=plugin -o writers/plugins/en.so writers/plugins/en/en.go
go build -buildmode=plugin -o writers/plugins/de.so writers/plugins/de/de.go
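
Before measuring sizes, here is a minimal sketch of the host application that loads such a plugin; the symbol name P matches the plugins above, and the interface is declared structurally on the host side:

package main

import "plugin"

// Writer mirrors the method each plugin exposes.
type Writer interface {
	Write()
}

func main() {
	p, err := plugin.Open("writers/plugins/en.so")
	if err != nil {
		panic(err)
	}
	sym, err := p.Lookup("P") // exported plugin variable
	if err != nil {
		panic(err)
	}
	w, ok := sym.(Writer) // Lookup returns a pointer to P, which satisfies Writer
	if !ok {
		panic("plugin symbol P does not implement Writer")
	}
	w.Write() // prints "Hello world"
}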

Now let's examine the size of the plugins:

ls -lh ./writers/plugins/*.so
-rw-r--r-- 1 pi pi 3.3M Apr 25 13:57 ./writers/plugins/de.so
-rw-r--r-- 1 pi pi 3.3M Apr 25 13:57 ./writers/plugins/en.so

As you can see, each plugin is 3.3 MB.
Let's build again, this time with ldflags:

go build -ldflags="-s -w" -buildmode=plugin -o writers/plugins/en.so writers/plugins/en/en.go

Check the size again

ls -lh ./writers/plugins/*.so
-rw-r--r-- 1 pi pi 3.3M Apr 25 14:28 ./writers/plugins/de.so
-rw-r--r-- 1 pi pi 2.4M Apr 25 14:28 ./writers/plugins/en.so

Building with ldflags reduces the size of the en.so plugin by almost 1 MB.


Let's run upx on this binary:

sudo apt-get install upx
chmod +x ./writers/plugins/en.so
upx -9 -k ./writers/plugins/en.so

Check the file size again

ls -lh ./writers/plugins/*.so
-rw-r--r-- 1 pi pi 3.3M Apr 25 14:28 ./writers/plugins/de.so
-rwxr-xr-x 1 pi pi 2.1M Apr 25 14:28 ./writers/plugins/en.so

Running upx reduces the size by another 0.3 MB.

This is the maximum reduction I could get from build optimizations alone.

Refactor the code

This is where we need to redesign the plugins and keep refactoring the code to reduce package imports.

Where does this plugin size come from? It's the import of the fmt package. If I comment out fmt.Println, build with ldflags, and run upx, the plugin size drops to 893K:

ls -lh ./writers/plugins/*.so
-rw-r--r-- 1 pi pi 3.3M Apr 25 14:49 ./writers/plugins/de.so
-rwxr-xr-x 1 pi pi 893K Apr 25 14:49 ./writers/plugins/en.so

So how do we keep the file size optimal and still achieve the result we need? Interfaces come to our rescue.

Let's create an interface, placed in a writers package since the plugins refer to it as writers.Plugger. This is just sample code; I am not following any naming conventions here:

package writers

type Plugger interface {
	Print(a ...interface{})
}

Every plugin should now rely on this interface to print hello world. See the refactored en plugin:

package main

// the import path for the writers package is assumed from the linked repo
import "github.com/sonyarouje/goplugin/writers"

type Plugin struct {
}

func (d Plugin) Write(plugger writers.Plugger) {
	plugger.Print("Hello world")
}

var P Plugin

Here is the type that satisfies the Plugger interface. This code must live outside the plugin packages; here it sits in the writers package itself:

package writers

import (
	"fmt"
)

type PluginUtil struct {
}

func NewPluginUtils() PluginUtil {
	return PluginUtil{}
}

func (p PluginUtil) Print(a ...interface{}) {
	fmt.Println(a...)
}
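
With that in place, the host hands a PluginUtil to the plugin, so fmt is linked once in the host binary instead of into every plugin. A hypothetical wiring in the host, reusing the loader shown earlier:

sym, err := p.Lookup("P")
if err != nil {
	panic(err)
}
w := sym.(interface{ Write(writers.Plugger) })
w.Write(writers.NewPluginUtils()) // the plugin prints through the host's fmt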

Check the size of the plugins again:

ls -lh ./writers/plugins/*.so
-rw-r--r-- 1 pi pi 1.5M Apr 25 15:11 ./writers/plugins/de.so
-rwxr-xr-x 1 pi pi 897K Apr 25 15:11 ./writers/plugins/en.so

Source code: https://github.com/sonyarouje/goplugin

Written by Sony Arouje

April 25, 2021 at 8:30 pm

Posted in .NET


Expo react-native development in Docker


I spend most of my free time learning to develop applications on different platforms. Recently I have been spending time with Expo, a platform for building react-native apps. Expo is a pretty good platform to kick-start your react-native development. One difficulty I always face is upgrading versions; for example, sometimes Expo releases multiple updates in a month. When upgrading on my Windows machine, there could be issues, either a file lock or something else. These installation issues lead to frustration and firefighting to return to a working state. Recently my friend Sendhil told me how he uses VS Code to develop remotely inside containers, so I decided to take a look at it.

I had kept away from Docker for some time and decided to try it again. It took me a few minutes to get a Docker image maintained by node up and running. The next step was to install expo-cli and the other dependencies needed to run my Expo test application. I had to overcome several errors that popped up when running Expo code in a container, and I spent hours reading forums and posts to resolve them one by one. Here is the Dockerfile I came up with, which can be used to develop any Expo-based application.

The workflow below holds good for any kind of node, react, or react-native development.

Dockerfile

FROM node:10.16-buster-slim
LABEL version=1.0.0

ENV USERNAME dev
RUN useradd -rm -d /home/dev -s /bin/bash -g root -G sudo -u 1005 ${USERNAME}

EXPOSE 19000
EXPOSE 19001
EXPOSE 19002

RUN apt update && apt install -y \
    git \
    procps

#used by the react-native builder to set the ip address; otherwise
#it will use the ip address of the docker container.
ENV REACT_NATIVE_PACKAGER_HOSTNAME="10.0.0.2"

COPY *.sh /
RUN chmod +x /entrypoint.sh \
    && chmod +x /get-source.sh

#https://github.com/nodejs/docker-node/issues/479#issuecomment-319446283
#should not install any global npm packages as root, a new user 
#is created and used here
USER $USERNAME

#set the npm global location for dev user
ENV NPM_CONFIG_PREFIX="/home/$USERNAME/.npm-global"

RUN mkdir -p ~/src \
    && mkdir ~/.npm-global \
    && npm install expo-cli --global

#append .npm-global to PATH; otherwise globally installed packages
#will not be available in bash
ENV PATH="/home/$USERNAME/.npm-global:/home/$USERNAME/.npm-global/bin:${PATH}"

ENTRYPOINT ["/entrypoint.sh"]
CMD ["--gitRepo","NOTSET","--pat","NOTSET"]


VS Code to develop inside a container

To enable VS Code to develop inside a container, we need to install the Remote Development extension pack. Here is a more detailed write-up from MS.

To enable remote development, we need two more files in our source folder:

  • docker-compose.yml
  • devcontainer.json

docker-compose.yml

version: '3.7'

services:
  testexpo:
    environment:
      - REACT_NATIVE_PACKAGER_HOSTNAME=10.0.0.2
    image: sonyarouje/expo-buster:latest
    extra_hosts:
      - "devserver:10.0.0.2"
    command: "--gitRepo sarouje.visualstudio.com/_git/expotest --pat z66cu5tlfasa7mbiqwrjpskia"
    expose:
      - "19000"
      - "19001"
      - "19002"
    ports:
      - "19000:19000"
      - "19001:19001"
      - "19002:19002"
    volumes:
      - myexpo:/home/dev/src
volumes:
  myexpo:


  • REACT_NATIVE_PACKAGER_HOSTNAME: tells the react-native builder to use the configured IP when exposing the bundler; otherwise it uses the Docker container’s IP, and the bundler will not be reachable from your phone.
  • command: specify your git repo and PAT (personal access token). When running docker-compose up, the container uses these details to clone your repo into its /home/dev/src directory.
  • volumes: containers are short-lived, and stopping a container without persistence will lose your data. For example, once the container is up we might install npm packages; if those packages don’t persist, we need to reinstall them every time we start the container. To persist the packages and changes, docker-compose creates a named volume and keeps the files of /home/dev/src in it, so they remain accessible even after a Docker restart.

Keep in mind that ‘docker-compose down -v’ removes the volume, after which we need to reinstall all the packages again.

devcontainer.json

Create a new folder named .devcontainer, and inside it create a file named devcontainer.json. Below is the structure of the file.

{
    "name": "expo-test",
    "dockerComposeFile": "../docker-compose.yml",
    "service": "testexpo",
    "workspaceFolder": "/home/dev/src/expotest",
    "extensions": [
        "esbenp.prettier-vscode"
    ],
    "shutdownAction": "stopCompose"
}
  • dockerComposeFile: tells VS Code where to find the docker-compose.yml file
  • service: the service configured in the docker-compose.yml file
  • workspaceFolder: once VS Code attaches to the container, it opens this workspace folder.
  • extensions: lists the extensions to install in the VS Code instance running inside the container.

Work flow

  • Download the latest version of docker
  • Open powershell/command prompt and run ‘docker pull sonyarouje/expo-buster’
  • Open your source folder and create docker-compose.yml and .devcontainer/devcontainer.json file
  • Modify docker-compose.yml to provide your git repo, PAT, etc.
  • Open VS Code in the source folder. VS Code will prompt you to reopen in a container; click the Reopen in Container button, wait a little, and VS Code will launch from within the container.
  • Once launched in the container, all your code changes live only in the container. Make sure to push your changes to git before exiting the container.

Advantages of containerized approach

We can spawn a new container with ease and test our code against any new version of the libraries we are using, without putting our dev machine at risk. If anything breaks or fails to compile, we can always destroy the container, go back to the dev container, and proceed with our development; there is no need to restore our dev machine to a working state. If the upgrade succeeds, we can destroy the current dev container and use the new container for development. No more hacking away at our current working setup.

Where is the source?


All the Dockerfiles and scripts are pushed to git. Feel free to fork the repo or send me a pull request with any changes. I created two versions of the Dockerfile, one for alpine and one for buster. As of now the stable VS Code release won’t support alpine, but you can always switch to the VS Code Insiders build to use alpine.

The Docker image is published to Docker Hub and can be pulled as sonyarouje/expo-buster or sonyarouje/expo-buster:3.0.6, where 3.0.6 is the version of expo-cli.


Written by Sony Arouje

August 2, 2019 at 7:18 pm

Posted in .NET

React-Native library for Azure AD B2C


For the last couple of days I have been spending my free time learning Azure Functions and how to authenticate them. I concentrated on Azure Active Directory B2C as my authentication provider. Maybe I will write another post detailing how to set up Azure AD B2C and configure Azure Functions.

I was able to create access tokens and access my functions via Postman. Next, I wanted to create a react-native mobile app and access the same function from it. I searched for libraries to enable AD B2C login but unfortunately couldn’t find any. I tried MSAL.js, but it will not work with react-native: MSAL needs localStorage or sessionStorage, which don’t exist in the react-native world.

Fortunately I found a library that does Azure AD login. The workflow of Azure AD and AD B2C is almost the same, so I decided to use the Azure AD library and add the Azure AD B2C functionality. I trimmed down the Azure AD library and removed the option to store tokens; caching of the tokens should be handled by the caller. The library performs the login flow and returns the tokens, and it can also refresh the access_token using the refresh-token flow.

Let's see how to use this library.

import React from "react";
import B2CAuthentication from "../auth-ad-js/ReactNativeADB2C";
import LoginView from "../auth-ad-js/LoginView";

const CLIENT_ID = "<provide your client id>";

export default class LoginScreen extends React.Component {
  static navigationOptions = {
    title: "Login"
  };

  render() {
    const b2cLogin = new B2CAuthentication({
      client_id: CLIENT_ID,
      client_secret: "<key set in application/key>",
      user_flow_policy: "B2C_1_signupsignin",
      token_uri: "https://saroujetmp.b2clogin.com/saroujetmp.onmicrosoft.com/oauth2/v2.0/token",
      authority_host: "https://saroujetmp.b2clogin.com/saroujetmp.onmicrosoft.com/oauth2/v2.0/authorize",
      redirect_uri: "https://functionapp120190131041619.azurewebsites.net/.auth/login/aad/callback",
      prompt: "login",
      scope: ["https://saroujetmp.onmicrosoft.com/api/offline_access", "offline_access"]
    });

    return (
      <LoginView context={b2cLogin} onSuccess={this.onLoginSuccess.bind(this)} />
    );
  }

  onLoginSuccess(credentials) {
    console.log("onLoginSuccess");
    console.log(credentials);
    // use credentials.access_token..
  }
}

The parameters passed to B2CAuthentication are the values I configured in Azure AD B2C. If the login succeeds, the onLoginSuccess callback receives the tokens.

The library is hosted in git.

Written by Sony Arouje

February 7, 2019 at 4:13 pm

Posted in React


simdb a simple json db in GO


Some days ago I decided to learn Go. Go is pretty easy to pick up; I could learn the syntax and semantics in a couple of hours. To completely learn a language I normally write a small app in it, so in my free time I rewrote the expense service I had created in nodejs in Go; it is now live and we are using it. This whole exercise allowed me to learn Go in detail.

For me, Go looks like a great, simple language with static type checking. It seems I will be using Go rather than nodejs for my future RPi projects. On the RPi I often use a simple json file as a db to store, retrieve, and update execution rules, sensor details, etc. In nodejs I use tingodb; I couldn’t find something very similar in Go, so I decided to write one, called simdb, a simple json db.

Using simdb I can persist structs and retrieve, update, or delete them from the json db. The db file created by simdb is a plain json file. Let's see some of the functions in simdb.


Create a new instance of db

driver, err := db.New("customer")

Insert a new Customer into the db

customer := Customer{
	CustID:  "CUST1",
	Name:    "sarouje",
	Address: "address",
	Contact: Contact{
		Phone: "45533355",
		Email: "someone@gmail.com",
	},
}
err = driver.Insert(customer)
if err != nil {
	panic(err)
}

Get a Customer

var customerFirst Customer
err = driver.Open(Customer{}).Where("custid", "=", "CUST1").First().AsEntity(&customerFirst)
if err != nil {
	panic(err)
}

Update a customer

customerFirst.Name = "Sony Arouje"
err = driver.Update(customerFirst)
if err != nil {
	panic(err)
}

Delete a customer

toDel := Customer{
	CustID: "CUST1",
}
err = driver.Delete(toDel)
if err != nil {
	panic(err)
}

The Update and Delete operations use the ID field of the struct to perform their work.
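
Since the db file is plain json, you can open it in any editor. After the insert above, the customer file would look roughly like this (my sketch based on the struct's json tags; the exact layout may differ):

[
  {
    "custid": "CUST1",
    "name": "sarouje",
    "address": "address",
    "Contact": {
      "phone": "45533355",
      "email": "someone@gmail.com"
    }
  }
]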


Let’s see the full code.

package main

import (
	"fmt"

	"github.com/sonyarouje/simdb/db"
)

type Customer struct {
	CustID  string `json:"custid"`
	Name    string `json:"name"`
	Address string `json:"address"`
	Contact Contact
}

type Contact struct {
	Phone string `json:"phone"`
	Email string `json:"email"`
}

// ID: any struct that needs to persist should implement this function
// defined in the Entity interface.
func (c Customer) ID() (jsonField string, value interface{}) {
	value = c.CustID
	jsonField = "custid"
	return
}

func main() {
	fmt.Println("starting....")

	driver, err := db.New("dbs")
	if err != nil {
		panic(err)
	}

	customer := Customer{
		CustID:  "CUST1",
		Name:    "sarouje",
		Address: "address",
		Contact: Contact{
			Phone: "45533355",
			Email: "someone@gmail.com",
		},
	}

	// Creates a new Customer file inside the directory passed as the
	// parameter to New(). If the Customer file already exists,
	// the insert operation adds the customer data to the array.
	err = driver.Insert(customer)
	if err != nil {
		panic(err)
	}

	// GET ALL Customers
	// Opens the customer json file and filters all the customers with name sarouje.
	// AsEntity takes the address of a Customer array and fills in the result;
	// we can loop through the customers array and retrieve the data.
	var customers []Customer
	err = driver.Open(Customer{}).Where("name", "=", "sarouje").Get().AsEntity(&customers)
	if err != nil {
		panic(err)
	}
	// fmt.Printf("%#v \n", customers)

	// GET ONE Customer
	// First() returns the first record from the results.
	// AsEntity takes the address of a Customer variable (not an array pointer).
	var customerFirst Customer
	err = driver.Open(Customer{}).Where("custid", "=", "CUST1").First().AsEntity(&customerFirst)
	if err != nil {
		panic(err)
	}

	// Update uses ID() to find the record and update the data.
	customerFirst.Name = "Sony Arouje"
	err = driver.Update(customerFirst)
	if err != nil {
		panic(err)
	}
	driver.Open(Customer{}).Where("custid", "=", "CUST1").First().AsEntity(&customerFirst)
	fmt.Printf("%#v \n", customerFirst)

	// Delete
	toDel := Customer{
		CustID: "CUST1",
	}
	err = driver.Delete(toDel)
	if err != nil {
		panic(err)
	}
}

TODO

The query syntax in simdb is not really great; I need to find a better approach.


Source Code: https://github.com/sonyarouje/simdb

Written by Sony Arouje

August 6, 2018 at 2:30 pm

Posted in GO


Slack App to track our expenses hosted in a Raspberry Pi


At 4Mans Land we kept struggling to track our expenses. We normally note expenses in WhatsApp and later move them to a shared Excel sheet, but this always has difficulties: there is a lot of noise in WhatsApp from ongoing discussions. We also use Slack for discussions, so I decided to integrate an expense app and evaluated a few. Most of them are paid, and we didn’t want to pay for an app at this stage. One night I decided to write a Slack app myself, and by morning I had finished a basic app that stores expenses in MongoDB. This whole experiment started as fun, a chance to play with something new and to understand the Slack API.

I wrote the server in nodejs and chose MongoDB for persistence. For testing the Slack integration I used ngrok to create a local tunnel. I also evaluated localtunnel.me, which is free but very unstable; ngrok is good, but the license costs $5 per month. I started evaluating other hosting alternatives; Heroku was one, and I had used it several times before, but at the moment we want to spend less on hosting and infrastructure. In the end I decided to host the server at home: I have a really stable internet connection with more than 80 Mbps speed and a gigabit router. I added port forwarding on my router and expected to reach the port immediately, but to my surprise it wasn’t working. I had done port forwarding many times before, with a different ISP, and could access things inside my network without any issues. I called my ISP’s support, and they informed me that without a static IP I wouldn’t be able to do port forwarding; for a static IP I had to pay a very minimal fee, good for as long as I stay with the provider. I paid the amount, got my static IP in two days, and port forwarding worked. I also set up dynamic DNS at noip.com to give my IP a good name. With all these settings done, I changed the URL in the Slack app to the one I set up at noip.com, ran the nodejs server on my dev machine, fired a command from Slack, and voila, the command reached the server running on my laptop. The server was ready and running on my laptop, but I needed a system that could run 24/7. The cheapest option that came to mind was to host the nodejs server on one of the Raspberry Pis lying on my table, so one night I decided to set up the Pi with the latest Raspbian Stretch.

Setting up the Raspberry pi

I expected to finish the whole setup in less than an hour, as I had done this several times before, but not that day. After installing Stretch, apt-get update started throwing a hash sum error. I reinstalled the OS a couple of times and got the same error again and again. This error sent me on a wild goose chase, reading one forum after another and trying all the suggestions, but nothing changed. At last I came across this Stack Overflow post, which fixed the issue, and I could finally update the OS, phew.

Next came nodejs. I used the nodejs distributions maintained here and issued the Ubuntu commands; for example, to install nodejs 10, run the commands below on your Raspberry Pi.

curl -sL https://deb.nodesource.com/setup_10.x | sudo -E bash -
sudo apt-get install -y nodejs

The next task was installing MongoDB. apt-get install mongodb installed an older version of MongoDB that is not suitable for the mongoose ORM I am using in the code; either I had to use an older version of mongoose or find a way to install a newer version of MongoDB. I chose the latter and went hunting for clues on installing a newer MongoDB. In the end I came across this post, which has all the details of installing MongoDB on an RPi running Jessie (and works on Stretch as well); the author also provides a link to Stretch Mongo binaries. I followed the instructions, and in the end MongoDB was running on my machine.

When I tried to connect from Compass to the RPi’s MongoDB instance, it wouldn’t connect. Reading through some docs and forums, I realized that I had to comment out bind_ip in /etc/mongodb.conf; after commenting out that setting, I was able to connect from Compass.
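
For anyone hitting the same thing, the relevant bit of /etc/mongodb.conf looks roughly like this; commenting out bind_ip makes mongod listen on all interfaces, so be aware of the security implications:

# /etc/mongodb.conf (snippet)
#bind_ip = 127.0.0.1   # commented out so Compass can connect from another machine
port = 27017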

Before going to bed at 2 am, I copied the nodejs code to the RPi and did some testing, and all went well. I could post expenses from Slack to the service running on my Pi, and went to bed with peace of mind.

What does this app do?

The main focus of this app is capturing the expense; for now it uses a Slack command to forward text to the service. Initially I created a strict pattern to record an expense, for example:

/expense spent 4000 INR on Jill to fetch a pail of water by Jack

Here 'spent' must be the first word, followed by the amount; after 'on' comes the description, and after 'by', who paid the amount. The nodejs service parses these and places them in separate fields of a MongoDB collection. Querying the expenses looks something like:

/expense query date today

/expense query date 05/12/2018 – 05/14/2018

So here 'spent' and 'query' are keywords; based on the keyword, different handling logic kicks in and completes the task.

I felt the syntax was too restrictive and started analyzing options to process the sentence more naturally. I came across natural, an NLP library for nodejs. Using natural's BayesClassifier, I trained the system to categorize the input text and derive the right keyword, which again invokes the corresponding logic and gets the task done. After the classification training, the system can take inputs like:

/expense Jack paid 4000 by Card for ‘Jill to fetch a pail of water’ on 05/16/2018

The above text is classified as 'spent', and then some code extracts the relevant parts from it. It's not a pure NLP approach: anything inside single quotes is treated as the expense description, any date in the text is considered the payment date, and so on. I am still learning NLP; in the future I may achieve a better translation of the text.
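
To give an idea of the classification step, here is a minimal sketch using natural's BayesClassifier; the training phrases are illustrative, not the actual ones I used:

const natural = require('natural');

const classifier = new natural.BayesClassifier();

// train with a few labelled samples per intent
classifier.addDocument('Jack paid 4000 by Card for groceries', 'spent');
classifier.addDocument('spent 200 INR on fuel', 'spent');
classifier.addDocument('show expenses for today', 'query');
classifier.addDocument('expenses between 05/12/2018 and 05/14/2018', 'query');
classifier.train();

console.log(classifier.classify('Jill paid 150 for a pail of water')); // -> spent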

The querying commands were simplified as shown below:

/expense today

/expense between 05/12/2018 – 05/14/2018

It can also query and return expenses in CSV format:

/expense today csv will return the expenses in comma-separated format.

Slack output

[screenshot: Slack response showing the newly created expense]

After each expense is created successfully, the app returns the expense with its ID and the actual text passed to the expense command. For example, to create the first expense I passed text like:

/expense Jack paid 4000 to ‘Jill to fetch a pail of water’

Here is the output for querying today's expenses, issued with a command like:

/expense today

[screenshot: today's expenses listed in Slack]


In csv format (/expense today csv)

[screenshot: expenses returned in CSV format]

We can also delete an expense; only expenses you created can be deleted.

/expense delete today: deletes all the expenses you entered today

/expense delete 116: deletes the expense with id 116

I had great fun during this experiment and learned a few things.

Happy coding…

Written by Sony Arouje

May 18, 2018 at 5:17 pm

react, redux and cycle – Introduction


For the last couple of months we have been working mostly in react js to develop a flexible platform for one of the products we are working on. I am planning a couple of posts explaining how to start developing in react with redux and cycle. To set up my dev environment, I used the react boilerplate created by my friend Sendhil, a basic template to jumpstart react js development with linting.

I am not going into depth on redux or redux-cycles; this is just an attempt to help with the initial phase of development. If you are not familiar with redux or why you need it, it's better to learn it from the creator.

Let’s use the template and incrementally add more features to it. I created a new github repository with some initial code to start with.

Below are the new npm dependencies added to package.json

  • react-redux: "^5.0.4",
  • redux: "^3.6.0",
  • @cycle/http: "^13.3.0",
  • @cycle/run: "^3.1.0",
  • @cycle/time: "^0.8.0",
  • redux-cycles: "^0.4.1",
  • xstream: "^10.9.0",
  • prop-types: "^15.5.10"

Every redux application needs a store, so I created one as well; see the code below.

import { applyMiddleware, createStore, compose } from 'redux';
import { createCycleMiddleware } from 'redux-cycles';
import { run } from '@cycle/run';
import { makeHTTPDriver } from '@cycle/http';
import { timeDriver } from '@cycle/time';
import logger from 'redux-logger';

// combined all the reducers in the application
import reducers from './reducers';
// combined all the cycles in the application
import cycles from './cycles';

// create cycle middleware to attach to the redux store.
const cycleMiddleware = createCycleMiddleware();
const { makeActionDriver, makeStateDriver } = cycleMiddleware;

// we might use multiple middleware; here we used a logger
// and cycle. We can add more middleware by adding them to
// the below array.
const middleWare = [
  logger,
  cycleMiddleware,
];

const initState = {};

// more about compose here http://redux.js.org/docs/api/compose.html
let composeEnhancers = compose;

// adding redux dev tool to visualize the store state.
// should be enabled only in development.
if (process.env.NODE_ENV !== 'production') {
  const composeWithDevToolsExtension = window.__REDUX_DEVTOOLS_EXTENSION_COMPOSE__;
  if (typeof composeWithDevToolsExtension === 'function') {
    composeEnhancers = composeWithDevToolsExtension;
  }
}

const store = createStore(
  reducers,   // all the available reducers combined
  initState,  // initial state of the reducers
  composeEnhancers(                 // adding store enhancers
    applyMiddleware(...middleWare), // attaching the middleware
  ),
);

// calling cycle's run() activates the cycles that we created;
// here all the different cycles are combined into one.
run(cycles, {
  ACTION: makeActionDriver(),
  STATE: makeStateDriver(),
  Time: timeDriver,
  HTTP: makeHTTPDriver(),
});

export default store;


I added some inline comments to the code, and hopefully it's pretty self-explanatory. Once the store is created, we need to pass it down to other components so that they can access state or dispatch actions.

Let’s edit index.jsx and add the below code.

import React from 'react';
import { render } from 'react-dom';
import { Provider } from 'react-redux';
import store from './store/create-store';
import App from './App';

render(
  <Provider store={store}>
    <App />
  </Provider>,
  document.getElementById('root'));


Great, we have created a basic react app with redux. In the next post I will cover how to create reducers and handle side effects using redux-cycles.


Happy coding…

Written by Sony Arouje

September 22, 2017 at 6:32 pm

Posted in JavaScript, React


RF Communication using nrf24L01 and Nodejs addon


Recently I started experimenting with radio communication using low-cost nRF24L01 modules. These modules are very cheap compared to the XBee modules I used earlier, and with them we can enable wireless communication between Arduinos and a Raspberry Pi effectively and economically. For my experiment I used two nRF24 modules, one connected to an Arduino Uno and another to a Raspberry Pi 1. Here are the pin connection details:

Seq  NRF24L01  RPi        Arduino Uno
1    GND       25 (GND)   GND
2    VCC       17 (3.3v)  3.3v
3    CE        15         7
4    CSN       24         8
5    SCK       23         13
6    MOSI      19         11
7    MISO      21         12
8    IRQ       -          -


For testing the communication, I used the RF24Network library, which is very good and has good documentation. It also comes with examples for both Arduino and RPi, so I didn't write a single line of code: I just used the examples and was able to see the communication working. Initially I had some trouble, but in the end everything worked well and I could see the data coming from the Arduino on the RPi.

My intention is to use these modules on the RPi and write the code in nodejs. Unfortunately there is no nodejs support for this library, so last night I decided to write a nodejs addon for this C/C++ library. I didn't have any experience writing nodejs addons; I spent an hour understanding Nan and creating very simple addons. Then I started writing the addon for RF24Network, which was much harder than toying with hello-world addons.

node-gyp kept failing when it tried to compile the RF24Network modules. In my searches I realized that node-gyp uses the make utility and that I needed to add the C/C++ files of the library to the build. In the end I could compile the node addon. See the binding.gyp file:

{ "targets": [ { "target_name": "nrf24Node", "sources": [ "nrf24Node.cc", "RF24/RF24.cpp", "RF24/utility/RPi/spi.cpp", "RF24/utility/RPi/bcm2835.c", "RF24/utility/RPi/interrupt.cpp", "RF24Network/RF24Network.cpp", "RF24Network/Sync.cpp" ], "include_dirs": [ "<!(node -e \"require('nan')\")", "RF24Network", "RF24" ], "link_settings": { "libraries": [ "-RF24", "-RFNetwork" ] } } ] }


I should say, I am just a beginner with node-gyp and this binding.gyp might need some improvements; anyway, with this gyp file the compilation succeeded.

Next was creating the addon file itself. Here I had to learn more about Nan's data types and callbacks. I started with simple functions, compiled, and moved on to the next. I took more time understanding callbacks, which allow the addon to call JavaScript callback functions. I also spent a lot of time understanding threading and creating a module that continuously listens for incoming messages and triggers the callback function so nodejs can process those messages. I used libuv for threading; it seemed easier to understand than Nan's AsyncWorker modules.

I spent that whole night learning, writing, and refactoring the addon, and finished the module by early morning. By then I could write a nodejs app that listens to incoming messages.

Here is sample nodejs code to listen for messages and acknowledge them back to the sender.

var rf24 = require('./build/Release/nrf24Node.node');

rf24.begin(90, 00);
rf24.printDetails();
rf24.write(1, "Ack");

rf24.readAsync(function (from, data) {
  console.log(from);
  console.log(data);
  rf24.write(from, "Ack");
});

process.on('SIGINT', exitHandler);

function exitHandler() {
  rf24.close(); // close the radio before the process exits
  process.exit();
}


Here is the complete addon. The code is uploaded to GitHub, with the steps to compile it and use it in your own nodejs applications.

#include <nan.h>
#include <v8.h>
#include <RF24.h>
#include <RF24Network.h>
#include <iostream>
#include <ctime>
#include <stdio.h>
#include <time.h>
#include <string>

using namespace Nan;
using namespace v8;

RF24 radio(RPI_V2_GPIO_P1_15, BCM2835_SPI_CS0, BCM2835_SPI_SPEED_8MHZ);
RF24Network network(radio);

Nan::Callback *cbPeriodic;
uv_async_t* async;

struct payload_t {   // Structure of our payload
  char msg[24];
};

struct payload_pi {
  uint16_t fromNode;
  char msg[24];
};

//--------------------------------------------------------------------------
// Below functions are just replicas of RF24Network functions.
// No need to use these functions in your app.
NAN_METHOD(BeginRadio) {
  radio.begin();
}

NAN_METHOD(BeginNetwork) {
  uint16_t channel = info[0]->Uint32Value();
  uint16_t thisNode = info[1]->Uint32Value();
  network.begin(channel, thisNode);
}

NAN_METHOD(Update) {
  network.update();
}

NAN_METHOD(Available) {
  v8::Local<v8::Boolean> status = Nan::New(network.available());
  info.GetReturnValue().Set(status);
}

NAN_METHOD(Read) {
  payload_t payload;
  RF24NetworkHeader header;
  network.read(header, &payload, sizeof(payload));
  info.GetReturnValue().Set(Nan::New(payload.msg).ToLocalChecked());
}
//--------------------------------------------------------------------------------

NAN_METHOD(Begin) {
  if (info.Length() < 2)
    return Nan::ThrowTypeError("Should pass Channel and Node id");
  uint16_t channel = info[0]->Uint32Value();
  uint16_t thisNode = info[1]->Uint32Value();
  radio.begin();
  delay(5);
  network.begin(channel, thisNode);
}

NAN_METHOD(Write) {
  if (info.Length() < 2)
    return Nan::ThrowTypeError("Should pass Receiver Node Id and Message");
  uint16_t otherNode = info[0]->Uint32Value();
  v8::String::Utf8Value message(info[1]->ToString());
  std::string msg = std::string(*message);
  payload_t payload;
  strncpy(payload.msg, msg.c_str(), 24);
  RF24NetworkHeader header(otherNode);
  bool ok = network.write(header, &payload, sizeof(payload));
  info.GetReturnValue().Set(ok);
}

void keepListen(void *arg) {
  while (1) {
    network.update();
    while (network.available()) {
      RF24NetworkHeader header;
      payload_t payload;
      network.read(header, &payload, sizeof(payload));
      payload_pi localPayload;
      localPayload.fromNode = header.from_node;
      strncpy(localPayload.msg, payload.msg, 24);
      async->data = (void *) &localPayload;
      uv_async_send(async);
    }
    delay(2000);
  }
}

void doCallback(uv_async_t *handle) {
  payload_pi* p = (struct payload_pi*) handle->data;
  v8::Handle<v8::Value> argv[2] = {
    Nan::New(p->fromNode),
    Nan::New(p->msg).ToLocalChecked()
  };
  cbPeriodic->Call(2, argv);
}

NAN_METHOD(ReadAsync) {
  if (info.Length() <= 0)
    return Nan::ThrowTypeError("Should pass a callback function");
  if (info.Length() > 0 && !info[0]->IsFunction())
    return Nan::ThrowTypeError("Provided callback must be a function");
  cbPeriodic = new Nan::Callback(info[0].As<Function>());
  async = (uv_async_t*) malloc(sizeof(uv_async_t));
  uv_async_init(uv_default_loop(), async, doCallback);
  uv_thread_t id;
  uv_thread_create(&id, keepListen, NULL);
  uv_run(uv_default_loop(), UV_RUN_DEFAULT);
}

NAN_METHOD(PrintDetails) {
  radio.printDetails();
}

NAN_METHOD(Close) {
  uv_close((uv_handle_t*) async, NULL);
}

NAN_MODULE_INIT(Init) {
  Nan::Set(target, New<String>("beginRadio").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(BeginRadio)).ToLocalChecked());
  Nan::Set(target, New<String>("beginNetwork").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(BeginNetwork)).ToLocalChecked());
  Nan::Set(target, New<String>("update").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(Update)).ToLocalChecked());
  Nan::Set(target, New<String>("printDetails").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(PrintDetails)).ToLocalChecked());
  Nan::Set(target, New<String>("available").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(Available)).ToLocalChecked());
  Nan::Set(target, New<String>("read").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(Read)).ToLocalChecked());
  Nan::Set(target, New<String>("readAsync").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(ReadAsync)).ToLocalChecked());
  Nan::Set(target, New<String>("write").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(Write)).ToLocalChecked());
  Nan::Set(target, New<String>("close").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(Close)).ToLocalChecked());
  Nan::Set(target, New<String>("begin").ToLocalChecked(),
    GetFunction(New<FunctionTemplate>(Begin)).ToLocalChecked());
}

NODE_MODULE(nrf24Node, Init)

All the credit goes to the developers of the RF24 and RF24Network libraries; I just created an addon for a great library. Along the way I learned a lot and managed to finish the nodejs addon.


Happy coding…

Written by Sony Arouje

February 5, 2017 at 4:57 pm