GetCoding
Author

Matt Heff


Computer Scientist / Software Engineer - 18 years of professional experience - JavaScript / Solidity / Node.js / Web 3.0 / Tech / Industry - Content Creator. I'm here to help you learn all the valuable lessons I've learned about JavaScript, web development, blockchain, software engineering, and tech in general.


How to read environment variables with Node.js

by Matt Heff 2 months ago

Node.js is a fast, scalable, and efficient runtime popular in the web development community, known for its event-driven, non-blocking I/O model. It also offers a convenient way of working with environment variables, allowing for easy configuration of a Node.js application.

How to check whether Node.js is installed on your machine

node -v

If it is installed you will see the version number returned; if Node is not present, the command line will return a "command not found" error.

How to install Node.js

  1. Go to the Node.js Downloads page.
  2. Download Node.js for your Operating System
  3. Run the installer, including accepting the license, selecting the destination, and authenticating for the install.

Now when you open a new console and run node -v, the command line will return the version you just installed.

Two methods for accessing environment variables in Node.js

1 – Access the environment variables already present

In Node.js, you can easily access variables set in your computer’s environment. When your Node.js program starts, it automatically makes all environment variables available through an “env” object. You can see the object by running the command “node” in your command line and then typing “process.env”.

You should see an object listing the variables from your shell session, such as PATH, HOME, and SHELL.

If we want to access a specific variable, SHELL for example, we can run a command such as

console.log(`The SHELL parameter is ${process.env.SHELL}`);

With this you can check what variables exist within the environment, update them, or add new ones, but any changes are only visible to that individual Node process and any children it spawns. This approach is easy, but it is not ideal when you work on multiple projects and need to specify environment variable settings per project. For that we have another approach.

2 – Access environment variables using dotenv

Dotenv is a Node.js module that lets you load environment variables from a .env file in your project. The file contains key-value pairs: the keys are the names of the environment variables and the values are the values you want to set for them. Once your .env file is set up, the dotenv module loads the variables into your Node.js application. This is the place to keep sensitive information, API keys being a prime example, along with anything else you don't want to hard-code into your application.

Important: If you use git or another version control system, make sure you do not include .env in version tracking. The last thing you need is this sensitive data pushed to a repo for someone to exploit.

Installing dotenv
npm install dotenv --save

After it has been installed, add the following line to the top of your js file

require('dotenv').config();

Create a .env File

After this is installed, create a file called .env in the top level of your project's folder structure. This file is the place for all your environment variables. There is no need to wrap string values in quotation marks; dotenv handles that for you automatically. Below is an example .env file.

Example .env file contents

DB_HOST=db.getcoding.io
DB_USER=admin
DB_PASSWORD=password
DB_NAME=example

An example of a DB connection using the DB_HOST parameter

const dotenv = require('dotenv');
const { Client } = require('pg');

// Load environment variables from .env file
dotenv.config();

// Get the value of the DB_HOST environment variable
const serverName = process.env.DB_HOST;

// Create a new PostgreSQL client
const client = new Client({
  host: serverName,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
});

// Connect to the database
client.connect()
  .then(() => console.log(`Connected to PostgreSQL on ${serverName}`))
  .catch(err => console.error(`Error connecting to PostgreSQL at ${serverName}: ${err}`));

Once you have saved this to a file (eg env-example.js), run the file and see what happens

node env-example.js

Depending on whether you have an actual Postgres database you can reach with the username/password/database values above, you will see either

Connected to PostgreSQL on db.getcoding.io
or

Error connecting to PostgreSQL at db.getcoding.io: Error: getaddrinfo ENOTFOUND db.getcoding.io

Other Issues — Package PG not found

If your environment does not have the required postgres package, you can install it with the following

npm install pg --save

Summary

Woo Hoo!! We have just learned, and hopefully experimented with, two methods for using environment variables in Node.js: reading them directly from the environment and loading them with dotenv. Armed with this knowledge you can now store and access any information your projects need in a secure and convenient way.

Have fun with it.


AI – What is Natural Language Processing (NLP)

by Matt Heff 6 months ago

G’day mates! Are you curious about how computers can understand and generate human language? Well, you’re in luck. I also wanted to find out about AI and language. Since GPT came along my obsession has turned towards generative AI and AI in general; it has been a while since I studied AI, and boy oh boy has the field come a long way since then, so let me share what I know about natural language processing (NLP).

NLP is a field of artificial intelligence (AI) that focuses on making computers smart enough to understand and generate human language. It’s used in all sorts of applications, from chatbots to language translation to text analysis, and it’s an exciting, rapidly evolving field full of opportunities and challenges. At the moment this type of AI has exploded, and it looks to have advanced to a level where most of us who work with information will be able to benefit from its assistance. But how does it work?

At a high level, NLP involves two main stages: understanding language and generating language. Understanding language involves analyzing a piece of text and figuring out what it means, while generating language involves using a computer to produce text that sounds like it was written by a human. At this moment I personally believe that with the right prompt engineering you can get usable text from the AI.

To do this, NLP systems use a combination of rules and machine learning algorithms. Rules-based systems use pre-defined rules and patterns to analyze and generate language, while machine learning systems learn from examples of human language to develop their own rules and patterns. It’s like teaching a wee tot how to speak English – it takes a bit of training, but once they get the hang of it, they can be pretty darn good at it!

Overall, NLP is a fascinating field of AI that looks like it is going to revolutionize how we interact with technology and how we communicate. If you’re interested in computers and language, you should definitely check out NLP – it might just be the perfect field for you! And who knows, with a bit of hard work and a positive attitude, you might just be the one to teach that wee tot how to speak English!


Web3 RPC Nodes

by Matt Heff 10 months ago

In this article we will look at Web3 RPC nodes. First we will explain remote procedure calls (RPC) in both Web2 and Web3. Then you will get a chance to interact with an RPC and dive into the internet of blockchains (Cosmos). The purpose is to help you understand the role of RPC nodes in Web3 and highlight the caveat of centralised RPC infrastructure in a decentralised network. Finally we will look at a hot new solution to this problem!

What is RPC?

Remote procedure calls are used in client-server systems: the client sends a request to the server, and the server responds with a result. During this exchange the client may specify a particular function or procedure and pass relevant parameters with it.

Web2 RPCs

Let’s look at an example with two systems: WordPress (CMS) and Moodle (LMS). We want to automatically enroll a customer in a Moodle course (and create their account if it does not already exist) after they purchase a course from WordPress. In Moodle we configure the default provided RPC options, or write some custom functionality if needed. In WordPress we write or find a plugin for Moodle integration and then configure the Moodle server's RPC particulars. The systems communicate securely via the security token generated in Moodle, for specified functions such as userExists, createUsers, authUser, getUser, courseList, and enrollUser.

RPCs allow us to integrate with systems and easily read or change data within them for an intended purpose. They afford simpler integration with many systems through a system-defined interface.

But what about Web3 RPC nodes, you ask? Let’s get into that. If you are new to the space, note that the term node is used instead of server.

Web 3 / Blockchain RPC Nodes 

RPC nodes are our gateway to read from and interact with the blockchain through their exposed API interface. We cannot communicate with a blockchain directly the way we can with a database; our dApps, and anything else that wants to interact with the blockchain, must use these nodes to interface with it. From the client's perspective, they submit their transaction with their wallet, which has a configured RPC node, and the transaction is submitted through that access point. The action could be transferring some tokens, interacting with smart contracts, trading, yield farming, staking, or some other feature offered by a dApp.

What if the RPC is offline?

If our configured RPC node is offline, our wallet should return an error and inform us; at this point a lot of people will be left scratching their heads, unsure what to do. Perhaps they would just try again later. Searching for an RPC for your chosen blockchain may net you an alternative, but do you need to trust it? Are all of them good actors?

If we have built a dApp that uses an RPC node's API for regular operations, we will be out of action unless we have incorporated a list of trusted backup RPC nodes and the functionality to iterate through that list in the event of failure. The uptime and reliability of RPC nodes are essential to the services provided and to the user experience.
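That fallback loop can be sketched in a few lines; firstHealthyNode and the example URLs are illustrative, not a real client library:

```javascript
// Try each trusted RPC node in order and return the first one that answers.
// `ping` is any async check that throws (or rejects) when a node is down.
async function firstHealthyNode(nodes, ping) {
  for (const url of nodes) {
    try {
      await ping(url);
      return url; // this node responded, so use it
    } catch {
      // node offline or unreachable: fall through to the next candidate
    }
  }
  throw new Error("No healthy RPC node available");
}

// Usage with a fake ping that simulates the first node being offline:
const nodes = ["https://rpc-a.example", "https://rpc-b.example"];
firstHealthyNode(nodes, async (url) => {
  if (url === "https://rpc-a.example") throw new Error("offline");
}).then((url) => console.log(url)); // https://rpc-b.example
```

In a real dApp the `ping` callback would be a cheap RPC call (a status or health-check request) with a timeout, so a hung node does not stall the whole loop.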

How can we trust an RPC node?

This is one of the Achilles' heels of Web3 at the moment. Let's say someone runs an RPC node with the intention of theft: they download the source code, make some modifications, and convince users to configure the node in their wallets. Perhaps instead of processing a transaction it throws an error message and suggests the user go to a phishing site, where they have the user sign a transaction and drain their wallet.

A similar event happened this year: an RPC node was compromised, and the attackers used this very ploy to phish users to their own website and deprive them of their assets.

Cosmos RPC Nodes & Tendermint

Cosmos is an internet of blockchains: a multitude of blockchains exist within it and are interoperable. Tendermint allows any blockchain to integrate into this ecosystem; it frees application developers to focus on their creations without the gargantuan task of building a network or deciding on and implementing consensus methods. This modularity (separation of the application, network, and consensus layers) is already the reason the network is growing at breakneck speed. As more and more chains take advantage of the Cosmos architecture and interoperability, a large target is placed upon these RPC nodes; if this infrastructure lives on centralised services, it becomes a honeypot, and compromises will have wide-scale impacts.


Explore Cosmos via Tendermint RPC

Let's get our hands a little dirty!

If you are new to blockchain there is no better way to get familiar with it than getting your hands dirty. Take a look at the latest Cosmos block:

View Latest Cosmos Block

If you prefer to look at this inside postman, below is a collection to get you started.

Tendermint RPC – Swagger (download)

Now that you are interacting with data from the blockchain save the link below to explore more!

Tendermint RPC – Full API Description


Critical Infrastructure – Should it be decentralized?

When billions of dollars, and services that hundreds of millions of people use, depend on the underlying infrastructure, its reliability, security, transparency, and motivations need to be considered. In the last two years explosive growth has resulted in a lot of centralisation of services that offer fast results for developers and users (a short-term win!), but in the medium term this introduces risks and dilutes some of the founding principles. A core tenet of Web3 is decentralisation; if that is diluted to the point where it barely exists at any level, then what is Web3? With that in mind, can we trust centralised services to be the gatekeepers of a decentralised system?

If you looked at the current Cosmos block using the links or Postman file above, how do you know the data returned to you is accurate? What is stopping a centralised service from misrepresenting the chain's state?

Will they restrict access arbitrarily upon government request?

Will they censor or curate information in ways that are not representative of the blockchain's actual state?

Will they serve as a focal point for attacks to compromise networks and harm users?

If they go out of business will all the services built relying on them also disappear?

Many of us have been happy to forget about the value of decentralisation, especially when we are not encountering any of the downsides that come with centralisation. It is a core part of Web3, though not an immediately obvious one. When it matters, it can be the difference between having access to your assets or not, between being able to use the services built on blockchains or not.

It seems to me that now is the time to re-evaluate how we interact with blockchains via RPC nodes as we move towards an interoperable ecosystem within Cosmos. Centralised services create a single point of failure whose impact ripples to every chain that becomes part of the Cosmos ecosystem, an expansive network which in time could include all chains. After all, it is the internet of blockchains.

The solution – LAVA

In 2022 a visionary team was established to revolutionize the way we interact with blockchains, and to ensure that centralisation, and the points of failure it introduces, does not go unchallenged and is stopped before its corrosive impact on decentralisation becomes widespread, ingrained, and regularly exploited.

It is my desire to be at the forefront of this mission: to help guide developers coming into the space in a way that promotes and spreads the core values of Web3, removes barriers to entry, resolves development issues, and fosters a community that will lead the whole industry into the next era. An era in which those who thrive will be decentralised and censorship resistant. It’s time to take the discussion beyond a network of value transfers, and bring it full circle to the values which established this entire space. The future is hot, the future is liquid, the future is LAVA!


Drupal 9 – Custom module development – Simple Module

by Matt Heff 11 months ago

After working with Drupal for years I am amazed by the number of custom modules out there and what they have unleashed for the businesses that use them and continue to innovate with. The ones I refer to here are the publicly accessible ones; I believe there are just as many private ones, highly specialised to the businesses that had them developed. That is the use case I want to target here: custom modules you can create for a client's specific needs.

In Drupal 9, custom modules offer the flexibility to create entirely new features and/or adjust existing ones to enhance a business's growth or open up new lines of what they offer. They allow virtually unlimited creativity and possibility for the owner of a Drupal system. For developers, they offer a way to provide highly novel solutions for those businesses.

The aim of this article is to help you build your first basic custom module in Drupal 9. Note these steps also work for Drupal 8 (D8 is end of life and migration plans ideally would be underway; if not, reach out to me and we may be able to work together on it).

Drupal 9 Module development

Let's getCoding! and create this module in just a few phases.

Phase 1: The Name

A great place to start is to decide on the name of your module; then we can create it under the folder path “/modules/custom”. The one I'll choose for this article is “simple_module“. You will notice my root folder is called drupal_tutorals; yours may be configured as web, root, or something else.

Rules for naming:

  • start it with lower-case letters (simple_module, not Simple_Module)
  • no spaces in the name (simple_module, not simple module)

Done! Let's get onto the next phase.

Phase 2: Create an info.yml file (so Drupal recognises the module)

Next we need to create a .yml file. If you are unfamiliar with .yml (or .yaml) files, YAML is a format that is great for storing configuration; it is a superset of JSON and not a markup language. If you want to learn more about JSON, read this article.

So in short, the .yml is a configuration file that lets Drupal know about our module. Create a file in our new folder (simple_module), call it simple_module.info.yml, and put the following code into it

name: Simple Module
type: module
description: 'Demo a creating a simple Drupal 9 Module.'
package: Custom
version: 0.5
core_version_requirement: ^8 || ^9

Let's look at what each of these config items means

name: the name that will be displayed in the module list in our Drupal admin section

type: either module or theme; in our case, module

description: a longer description of the module that will appear under the name in the Drupal admin section

package: specifies this as a custom module

version: our module version. I have put 0.5 as it is in dev; a full release should go out as 1.0

core_version_requirement: tells Drupal which core versions this will work with (8/9)

That is our config done! Nice, we are almost there; now we just need to add our routing file and our controller.

Phase 3: Create Module Routing, routing.yml

I really like the routing system in Drupal 8/9; it replaces the routing parts of hook_menu() from Drupal 7. The routing system is mostly based on Symfony's, and where I say mostly: it can do everything that Symfony's can and more, and both Drupal 8/9 and Symfony use the same syntax to define routes.

In short: routing is how our pages are made accessible via URLs.

Create a file simple_module.routing.yml in our simple_module folder and add the following

simple_module.demopage:
  path: '/simple/demopage'
  defaults:
    _controller: '\Drupal\simple_module\Controller\SimpleController::demopage'
    _title: 'Simple Demo page for Drupal 9'
  requirements:
    _permission: 'access content'

So let's unpack what we have here in the simple_module.routing.yml file

Line #1: the route name

Line #2: the URL path this route will be registered to, eg https://demo.com/simple/demopage

Line #4: a reference to the _controller, pointing at the demopage() method of SimpleController (we will be making this file next)

Line #5: the _title (default title) of the page served by the controller we specify on Line #4

Line #7: the _permission values needed to access the page. Here we just want a public page, so access content is the only one needed; if we were creating an admin page or something within a user profile we would add more restrictive permissions

So far we have the shell of our module (phases 1 & 2) and the routing information for the page we are making (phase 3). Now for our final phase.

As mentioned for Line #4, the controller is what we need next, so that we can build out the meat of our custom module and define the actual behaviour of what will be on the page.

Phase 4: Create a Controller

In the folder we created in phase 1 (/modules/custom/simple_module) we need to create some additional folders (/modules/custom/simple_module/src/Controller), and in there a file "SimpleController.php" containing the following

<?php
namespace Drupal\simple_module\Controller;
class SimpleController {
  public function demopage() {
    return array(
      '#markup' => "Congratulations, you have made your first Custom Module in Drupal 9!<br/>Time to Celebrate!<br/>Don't forget to reward yourself!"
    );
  }
}

From here we are done with the code!

Your file structure should now look like this:

modules/custom/simple_module/
  simple_module.info.yml
  simple_module.routing.yml
  src/Controller/SimpleController.php

Log into your Drupal installation and enable the simple module we just created. As always, we need to check that what we have done works, so navigate to /simple/demopage. You may need to clear the cache if it does not work:

admin / configuration / performance and then clear cache on that page.

Try the location /simple/demopage again.

That's the end. Whilst the module itself does not do much, you are now able to create a custom module and implement whatever it is your client or company needs.

You can access the code for this module at this GitHub repo


Create a RESTful API in NodeJS

by Matt Heff 11 months ago

Ever wanted to create your very own API, or do you just need some syntax for something you are currently working on? Well, we have you covered!

Tools you will need

  • Visual Studio Code (or another code editor)
  • Command line
  • Node / npm installed
  • Postman ( or some way to interact with the API)

Navigate to the directory you want to set up this project in

cd /home/heff/Code/NodeJS/ApiTutorial-Simple
npm init
npm install express cors
code .

Create products.json

Create a file named products.json in the root directory and put the following JSON into it. This is a simple product database that we will read, update, create, and delete from (GET, PUT, POST, DELETE).

I have no intention of permanently altering this data: we will load it into the memory of the service and modify it there, but after the service is terminated nothing will be retained. This is especially helpful once we start deleting.

[
  {
    "id": "Prod1",
    "name": "Online Product #1",
    "type": "online",
    "qty": null,
    "price": 10.5
  },
  {
    "id": "Prod2",
    "name": "Shipping Product #2",
    "type": "physical",
    "qty": 55,
    "price": 17.5,
    "shipping_price": 10
  },
  {
    "id": "Prod3",
    "name": "Online Course Product #3",
    "type": "physical",
    "qty": 55,
    "price": 17.5,
    "shipping_price": 10
  }
]

Create a server.js file

Create a server.js file; this is the one we will have Node execute for us. We will put it together in sections and test along the way. The sequence will be:

  • Load the Products Data
  • Initialise the API Service
  • Create GET
    • Get All
    • Get by ID
  • PUT
  • POST
  • DELETE

OK, now we are ready to GetCoding!

Load the Products Data

Add the following to the top of server.js:

let products = require("./products.json");

console.log(products)

Then run the command node server.js and you should see the products JSON object written to the console.

Great! Now let's make a service that will listen to us.

Initialise the API Service

Add the following code to your server.js file. What we are doing here is getting our service set up and configured so it will sit and listen for instructions. I have included some CORS handling in case anyone tries to test this via their browser console instead of Postman.

const express = require("express");
const cors = require("cors");
const app = express();

var corsOptions = {
  origin: "http://localhost:9999",
};

app.use(cors(corsOptions));
// parse requests of content-type - application/json
app.use(express.json());
// parse requests of content-type - application/x-www-form-urlencoded
app.use(express.urlencoded({ extended: true }));


////////////////////////////////////////
//Insert GET/PUT/POST/DELETE functions in this section

//Get functions
app.get("/product", function (req, res) {
  res.json(products);
});

app.get("/product/:id", function (req, res) {
  let item = products.find((product) => product.id === req.params.id);
  res.json(item);
});

//put functions

// post functions

// delete functions

////////////////////////////////////////


var server = app.listen(9999, function () {
  var host = server.address().address;
  var port = server.address().port;
  console.log(`Simple API listening on host: ${host} with port: ${port} `);
});

If you run this now you will have a service start up and sit there listening on port 9999. How cool is that!

You can use postman now to perform a get on

  • http://localhost:9999/product
  • http://localhost:9999/product/Prod1

Postman returns the full product list for the first URL and the single Prod1 object for the second.

At this stage you now have a very basic API! If this is your first time congratulations! I hope you remember me!

Try looking for Prod2 and Prod3, and even a Prod4 that does not exist, to see what happens.
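What you will see for Prod4 comes down to Array.prototype.find: it returns undefined when nothing matches. A stand-alone illustration with a trimmed-down product list:

```javascript
const list = [{ id: "Prod1" }, { id: "Prod2" }, { id: "Prod3" }];

// find returns the first matching element...
console.log(list.find((p) => p.id === "Prod2")); // { id: 'Prod2' }

// ...and undefined when nothing matches, which is why a GET for Prod4
// comes back with an empty body rather than an error. You could add a
// check for undefined in the route and respond with a 404 instead.
console.log(list.find((p) => p.id === "Prod4")); // undefined
```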

PUT

Add the following code under the comment //PUT functions

Here we need to make sure the data sent to us is not empty. From there we look for our item in the product list and, once we find it (assuming we do), we update it with the JSON sent. The whole record must be sent; sending only partial data will result in missing fields.

app.put("/product/:id", function (req, res) {
  if (!req.body) {
    // return so we do not fall through and try to send a second response
    return res.status(400).send({
      message: "Content can not be empty!",
    });
  }
  console.log(req.body);
  //find the product by id and update it
  let index = products.findIndex((x) => x.id == req.params.id);
  if (index === -1) {
    return res.status(404).send({
      message: `No item found with the ID: ${req.params.id}`,
    });
  }
  products[index] = req.body;

  console.log(products);
  res.json(products);
});

POST

Add the following code under the comment //POST functions

Here we need to make sure the data sent to us is not empty. From there we look for our item in the product list; if we don't find it we create a new one, otherwise we let the caller know one already exists.

app.post("/product", function (req, res) {
  if (!req.body) {
    // return so we do not fall through and try to send a second response
    return res.status(400).send({
      message: "Content can not be empty!",
    });
  }
  // check to see if there is a product that exists with that id
  let index = products.findIndex((x) => x.id == req.body.id);
  if (index !== -1) {
    return res.status(400).send({
      message: `Item already exists with the ID: ${req.body.id}`,
    });
  }
  //add new product to the list
  products.push(req.body);
  res.json(products);
});
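The duplicate check works because Array.prototype.findIndex returns -1 when no element matches; a quick stand-alone illustration:

```javascript
const list = [{ id: "Prod1" }, { id: "Prod2" }];

// An id that is already taken yields its position in the array...
console.log(list.findIndex((x) => x.id === "Prod1")); // 0

// ...while -1 signals no match, so it is safe to push the new product.
console.log(list.findIndex((x) => x.id === "Prod9")); // -1
```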

DELETE

Add the following code under the comment //DELETE functions

The code below looks for a matching product with the value that was passed to it and if it finds one it will remove it from the product list.

app.delete("/product/:id", function (req, res) {
  //here we remove the product from our list if it is found
  let tmp = products.filter((item) => item.id !== req.params.id);
  products = tmp;
  console.log(products);
  res.send(products);
});

GITHUB

You can find the code for this on GitHub, along with a Postman collection so you can test out the API.

GitHub Repo

NodeJS.postman_collection (download)

How to Add Hours to a Date Object in JavaScript

by Matt Heff 11 months ago

As if programming with dates did not have enough headaches from timezones, daylight savings, and leap years, the JavaScript Date API does not offer us a way to add hours to Date objects.

Fortunately, in this tutorial we are going to do it ourselves, a couple of different ways.

Method 1: Create a function ( use Date.setHours() )

function addHours(dateObj, hours){
   //Create a new Date object so that we don't alter the one passed in
   //(this keeps the function pure)
   const dateUpdated = new Date(dateObj.getTime());
   dateUpdated.setHours(dateUpdated.getHours() + hours);
   return dateUpdated;
}


const dateDemo = new Date('2022-06-22T12:00:00.000Z'); 
console.log(dateDemo); //2022-06-22T12:00:00.000Z

let dateModified = addHours(dateDemo,1)
console.log(dateModified); //2022-06-22T13:00:00.000Z

dateModified = addHours(dateDemo,3); //Original Date is not modified
console.log(dateModified); //2022-06-22T15:00:00.000Z

dateModified = addHours(dateModified,3)
console.log(dateModified); //2022-06-22T18:00:00.000Z


In the above function we create the variable dateUpdated as a new Date object so that the dateObj passed in does not get modified. Date.setHours() updates the object in place; by copying first, the implementation stays a pure function, which helps prevent unexpected behaviours.
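The same copy-then-mutate pattern generalises to other units. A minimal sketch; the helper names addMilliseconds, addMinutes, and addDays are our own, not part of the Date API:

```javascript
// All units reduce to milliseconds, so one pure helper covers them all.
function addMilliseconds(dateObj, ms) {
  return new Date(dateObj.getTime() + ms);
}
const addMinutes = (d, mins) => addMilliseconds(d, mins * 60 * 1000);
const addDays = (d, days) => addMilliseconds(d, days * 24 * 60 * 60 * 1000);

const base = new Date("2022-06-22T12:00:00.000Z");
console.log(addMinutes(base, 90).toISOString()); // 2022-06-22T13:30:00.000Z
console.log(addDays(base, 2).toISOString());     // 2022-06-24T12:00:00.000Z
console.log(base.toISOString());                 // 2022-06-22T12:00:00.000Z (unchanged)
```

One caveat: treating a day as a fixed 24 hours is fine for UTC timestamps like these, but in local time it can be off by an hour across a daylight-saving transition.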

Method 2: Use NPM Package date-fns

If you are inclined to use NPM packages, this is a great one to make dates a little nicer in JavaScript or Node.js.

This assumes you have Node.js installed, or an IDE where you can install NPM packages; you can install the package via the command below. We are only going to focus on demonstrating the addHours functionality.

npm install date-fns

import { addHours } from 'date-fns';

const date = new Date('2022-06-22T12:00:00.000Z');

//Add 1 hour to the existing date
let newDate = addHours(date, 1);
console.log(newDate); // 2022-06-22T13:00:00.000Z

//Add 2 hour to the already updated date
newDate = addHours(newDate, 2);
console.log(newDate); // 2022-06-22T15:00:00.000Z

//Original Date Object has not been modified
console.log(date); // 2022-06-22T12:00:00.000Z

The package offers loads of date functions; if you are in a situation to use it I would recommend it. Anything that makes our lives easier with dates is worthwhile!

A few examples of date-fns functions

addDays, addHours, subHours, addMinutes, subMinutes, addSeconds, subSeconds, subDays, addWeeks, subWeeks, addYears, subYears

Code Repo

Github: Example 1

Github: Example 2


Create your own JSON.stringify() function

by Matt Heff 11 months ago

If you have been around the traps of JavaScript for a while you will know JSON.stringify(); you may have even used it to compare objects, clone objects and arrays, or work with RESTful APIs. Using it is not enough for people like you and me: we must pull it apart, look under the hood, and then make our own version of it.

This is a two-way operation: JSON.stringify() turns objects into strings, and its inverse, JSON.parse(), objectifies strings. This is a good type of objectification.

Example of JSON.stringify() and JSON.parse()

const demoJsonObj = {
  mission: "CreateJsonStringify",
  timeToComplete: 10,
  useExistingFunction: false,
  reasons: ['Inquisitive','Masochist','Bragging Rights'],
  toolsAllowed:{
    language: "JavaScript",
    enhancer: "Coffee",
    motivation: "Binaural Beats",
    alternativeMethod: "Youtube - Check out my video tutorial"
  }
};

console.log(JSON.stringify(demoJsonObj));
// Output
//{"mission":"CreateJsonStringify","timeToComplete":10,"useExistingFunction":false,"reasons":["Inquisitive","Masochist","Bragging Rights"],"toolsAllowed":{"language":"JavaScript","enhancer":"Coffee","motivation":"Binaural Beats","alternativeMethod":"Youtube - Check out my video tutorial"}}


//Test the string is valid by parsing it back
console.log(JSON.parse(JSON.stringify(demoJsonObj)));

Ok, that was nice and easy: we pass in a JavaScript object and get a JSON string. Job done for 99% of folks out there. Not for us! Roll up your sleeves, as we are only just about to GetCoding!

Where to begin?

Shall we jump in and start bashing out some code to start with? Or shall we have a think about what we are going to be doing here?

I’d start bashing on the keyboard, only to get 5 mins in and be like hmmmm. So here I am, faced with the question: how do we take a JSON object and convert it into a JSON string?

What is a JSON Object? What can be in it? Do we need to cover all cases? Are there any weird things that could happen?

How do I go through each object item? What if it is an array of items? What if there is an object in the object? What if there is an array in the object, or in the object’s object’s array... does that even make sense?

Hmmm, that’s a lot of questions. Let’s start with what a JSON object is: W3Schools and MDN. I found the W3Schools page better; it gave me a nice convenient list of the valid JSON types

  • string
  • number
  • object
  • array
  • boolean
  • null

It also gave an example of how to loop through a JSON object. We are definitely going to need this!

//looping through the JSON Object using the object above
//variable is -- demoJsonObj -- just to remind you

for (const item in demoJsonObj) {
  console.log(item);
}

//This is handy, we can loop through the object and output each item. 

//Can you think how else we can use it?

This is excellent; at least we have a starting point now. We know a bit about JSON, and we definitely know that we need to handle these types. At some point we are going to need to check the type of the data we are looking at, so let’s make that code now.

Here your style preference can change how you write this: you could use if statements, a switch, or a function for each data type that returns true/false.

function isStr(val){
    return typeof val === 'string';
}

function isNum(val){
 return typeof val === 'number';
}

// when checking for an object we must ensure it is not null and not an array. Both null and arrays have typeof 'object' in JS
function isObj(val){
    return typeof val === 'object' && !Array.isArray(val) && val !== null;
};

// We need to check that it is an Array and also of type object

function isArray(val){
    return Array.isArray(val) && typeof val === 'object';
};

function isBool(val){
  return typeof val === 'boolean';
}

function isNull(val){
    return val === null;
}

So that handles the data types listed by W3Schools. There was a bit of knowledge I did have about typeof returning 'object' for null and arrays; if you didn’t have that, you might get some weird results.
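A quick demonstration of those typeof quirks, runnable on its own:

```javascript
// null and arrays both report "object", which is why isObj/isArray need extra checks
console.log(typeof null);         // "object"
console.log(typeof []);           // "object"
console.log(typeof {});           // "object"

// Array.isArray is the reliable way to tell arrays apart from plain objects
console.log(Array.isArray([]));   // true
console.log(Array.isArray({}));   // false
console.log(Array.isArray(null)); // false
```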

So now let’s build out some code that will take a JSON object, loop through each item, and check its type.

function CustomJsonStringify(obj){

  // check if it is a string
  if(isStr(obj)){
    return `"${obj}"`;
  }

  if(isNum(obj) || isBool(obj)){
    return `${obj}`;
  }

  if(isArray(obj)){
    let arrayString = "";
    obj.forEach((val) => {
      arrayString += CustomJsonStringify(val);
      arrayString += ',';
    });
    return `[${arrayString.replace(/,*$/, '')}]`;
  }

  // loop through each item in the object and handle it
  if(isObj(obj)){
    let objectString = "";
    let keys = Object.keys(obj);

    keys.forEach((key) => {
      let val = obj[key];
      objectString += `"${key}":${CustomJsonStringify(val)},`;
    });
    return `{${objectString.replace(/,*$/, '')}}`;
  }
}


console.log(CustomJsonStringify(demoJsonObj));

//Pass string into JSON.parse() to check it is valid
console.log(JSON.parse(CustomJsonStringify(demoJsonObj)));

Conclusion

Here we have a basic stringify function. There are edge cases which are not handled, and to make it more robust and a 1:1 equivalent we would need to add handling for

  • functions, undefined and Symbol values
  • Dates
  • NaN, null and Infinity
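For reference, this is how the built-in JSON.stringify() treats those cases; a true 1:1 equivalent would need to reproduce this behaviour:

```javascript
// functions and undefined values are silently dropped from objects
console.log(JSON.stringify({ fn: function () {}, u: undefined })); // {}

// Dates are serialised via their toJSON() method into an ISO string
console.log(JSON.stringify(new Date('2022-06-22T12:00:00.000Z'))); // "2022-06-22T12:00:00.000Z"

// NaN and Infinity both become null
console.log(JSON.stringify(NaN));      // null
console.log(JSON.stringify(Infinity)); // null
```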

Hope this has been helpful to you. I’ll have a YouTube video going through this in the near future!

Code on GitHub


How to add additional file types in WordPress

by Matt Heff 11 months ago

This is the simplest way to give yourself complete freedom to upload whatever type of file you want in WordPress. The standard list of allowed files is far too narrow for most sites that are going to be delivering more than just text as content.

Let’s get straight to it!

Allow Additional File types using wp-config.php

This method requires that you have access to the file system and can modify the wp-config.php file. If you are not able to do this, then you are better off using a plugin or modifying the functions.php file (more on that below).

Steps to relieve your frustration

  • Navigate to your WordPress installation directory
  • Make a copy of your wp-config.php (just in case)
  • Open wp-config.php in any editor
  • Add this to the file
define( 'ALLOW_UNFILTERED_UPLOADS', true );
  • Now save the file back into your WordPress directory; flushing the caches may help it take effect quicker.

* Note: Some plugins / themes may have their own file filters in place, but this is guaranteed to work with the core media library.

Allow Additional File types using functions.php / upload_mimes filter.

If you don’t have direct access to the file system, then this may work for you

  • Navigate to Appearance -> Theme File Editor
  • Select the functions.php file
  • Add the following code
function my_mime_types($mimes) {
    $mimes['json'] = 'application/json';
    return $mimes;
}
add_filter('upload_mimes', 'my_mime_types');

define( 'ALLOW_UNFILTERED_UPLOADS', true );  // the same setting as we had in wp-config.php in the previous section
  • Save the modifications to the functions.php and you will be good to go!

That is pretty much it for this entry. Below is a list of file types you may be interested in; these are mostly there to help me catch interest from specific searches for those file types.

I hope this has been helpful for you.

Types of files to allow in WordPress

Documents & Sheets

  • .docx & .doc (Microsoft Word Document)
  • .rtf (Rich Text Format File)
  • .tex (LaTeX Source Document)
  • .log (Log File)
  • .pdf (Adobe: Portable Document Format)
  • .xlsx, .xls (Microsoft Excel Document)
  • .pptx, .ppt, .pps, .ppsx (Microsoft Powerpoint File)
  • .odt (OpenDocument Text Document)

Audio Files

  • .wav (WAVE Format)
  • .mp3 (MPEG3 Format)
  • .ogg (OGG Multimedia Container)
  • .m4a (Advanced Audio Coding)
  • .aif ( Audio Interchange File Format)
  • .wma ( Windows Media Audio File)

Image Files

  • .jpeg & .jpg (Joint Photography Experts Group)
  • .psd (Adobe Photoshop Document)
  • .png (Portable Network Graphics)
  • .gif (Graphics Interchange Format)
  • .ico (Icon File Extension)
  • .obj (Wavefront 3d Object File)
  • .3ds (3D Studio Scene)
  • .tif (Tagged Image File)
  • .tiff (Tagged Image File Format)

Video Files

  • .wmv (Windows Media Video)
  • .rm (RealMedia File)
  • .flv (Flash Video File)
  • .mpg (MPEG Video)
  • .mp4 (MPEG4 Format)
  • .m4v (Video Container Format)
  • .mov (Quick Time Format)
  • .avi (Audio Video Interleaved Format)
  • .ogv (OGG Vorbis Video Encoding)
  • .3gp (Mobile Phone Video)

Data Files

  • .CSV Comma-Separated Values File
  • .DAT Data File
  • .GED GEDCOM Genealogy Data File
  • .JSON JavaScript Object Notation File
  • .KEY Apple Keynote Presentation
  • .KEYCHAIN Mac OS X Keychain File
  • .PPT Microsoft PowerPoint Presentation (Legacy)
  • .PPTX Microsoft PowerPoint Presentation
  • .SDF Standard Data File
  • .TAR Consolidated Unix File Archive
  • .VCF vCard File
  • .XML XML File


What is an AMM ( Automated Market Maker)

by Matt Heff 12 months ago

With the explosion of DeFi onto the world, it’s time to pull it apart and make sense of what it is and what it offers us, now that yet another crypto Djinn is out of the bottle.

Summary

Automated market makers (AMMs) are a major part of the decentralised finance (DeFi) ecosystem. These forms of magic allow cryptocurrencies and other digital assets to be traded in a permissionless and automated way. They use pools of liquidity rather than requiring a seller and a buyer in a marketplace, and prices are worked out via math. Liquidity can be provided by anyone, and incentives are offered for doing so; keep in mind they are not risk free.

Topics covered

  • What is an AMM
  • How do they work?
  • Liquidity Providers and Liquidity Pools
  • Slippage
  • The Math!

What is an AMM

Automated Market Makers (AMMs) allow cryptocurrencies/digital assets to be traded in a permissionless and automated way using liquidity pools. These differ from traditional markets because both a buyer and a seller are not needed; in traditional markets the buyer and seller agree upon a price and then the transaction takes place. With AMMs, transactions can take place at any time, so long as the buyer is happy with the rate that is currently available.

Decentralised Exchanges that use AMMs

With a Decentralised Exchange (DEX), there is no 3rd party that mediates the exchange. With AMMs, the algorithm utilises the liquidity that has been provided to settle the transaction at the current rate. The algorithms can vary between DEXs, but the intention is to have deep liquidity, be cheap to use, and be available all the time.


Liquidity Providers and Liquidity Pools ( Yield farming )

Liquidity relates to how easy it is to buy, sell, or convert an asset. A house is not very liquid, as it takes a long time to sell it and for the asset to be transferred. Cash is liquid: you can take it and convert it to many assets, goods, or services, and it requires little time to settle the transaction. As decentralised finance started out, it was difficult to find people on the network to trade cryptocurrencies with, so there was a liquidity problem. Finding a person that wanted to trade the same cryptocurrencies, and also the same amount, was a task in itself. AMMs fix this by creating liquidity pools of assets; the users who provide liquidity are offered rewards and incentives for doing so, and the greater the amount of assets in the pool, the deeper the liquidity and the easier it is to trade on DEXs.

With traditional exchanges users buy and sell with other users; AMMs instead allow users to trade against a pool of tokens. A pool usually consists of two tokens, e.g. BTC and USDC, and is used by people wanting to trade BTC for USDC or USDC for BTC. The price is worked out by a formula, and these formulas can differ between DEXs, even between different pools on a single DEX.

When a user alters the amount of tokens in the pool, the ratio changes, and as a result so does the price.

Anyone who has tokens (usually ERC-20) is able to provide liquidity and earn for doing so. Incentives come in the form of trading fees (typically 0.1% per trade) and bonus rewards, usually in the DEX’s native token or some other offering to entice people to provide liquidity. This is known as yield farming.

Slippage

When you use a DEX to trade one token for another, depending on how deep the liquidity is and the size of the trade, you will encounter slippage: the deviation of the current price from the price you will actually pay per token. This is a result of you moving the market; if the slippage is quite high, your trade is large relative to the pool’s liquidity.

A simple example: let’s say you want to swap USDC for USDT; both are USD stablecoins, so each is equal to $1. If we swap 100 USDC and have a slippage of 5%, then we can expect to get 95 USDT. So 1 USDC will trade for 0.95 USDT.

If that same pool had a lot more liquidity, then we might get a slippage of 0.1% (basically just the AMM trading fee), and in that case 100 USDC would get us 99.9 USDT.
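That arithmetic can be sketched as a tiny helper; expectedOut is a name I made up for illustration, not part of any DEX API:

```javascript
// Expected tokens received for a given quoted slippage percentage
function expectedOut(amountIn, slippagePct) {
  return amountIn * (1 - slippagePct / 100);
}

console.log(expectedOut(100, 5));   // 95 -- the 5% slippage example above
console.log(expectedOut(100, 0.1)); // ≈ 99.9 -- the deep-liquidity case
```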

Constant Product Formula

The elegance of math, and what allowed this bit of technology to be unleashed, is the following simple formula:

Constant product formula

x * y = k

Here x represents the amount of AssetX in the pool and y represents AssetY, whilst k is a constant. If someone buys AssetY and sells AssetX for it, then the relative price of AssetY to AssetX will change: we remove some AssetY and add some AssetX, and more x and less y results in y’s relative price increasing and x’s decreasing, so that the product still equals the constant k.

I like to think about it as a see-saw: as there is more X its price goes lower, and as there is less Y its price goes higher (relative to each other). This creates arbitrage opportunities for traders and keeps the pools close to the values on exchanges.
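To make the see-saw concrete, here is a minimal sketch of a constant-product swap (fees ignored, function name my own):

```javascript
// Constant-product swap: the pool holds x of AssetX and y of AssetY, and x * y = k must hold
function swapXforY(x, y, dx) {
  const k = x * y;       // the invariant before the trade
  const newX = x + dx;   // the trader deposits dx of AssetX
  const newY = k / newX; // the pool resizes y so that newX * newY = k
  return y - newY;       // the amount of AssetY paid out to the trader
}

// A pool with 1000 X and 1000 Y: swapping in 100 X returns about 90.9 Y,
// not 100, because the trade itself moves the price (slippage)
console.log(swapXforY(1000, 1000, 100)); // ≈ 90.909
```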

There are a range of formulas for equating the values of tokens in pools.

Constant sum formula

x + y = k

Constant mean formula

 (x1 * x2 * … * xn)^(1/n) = k

There are many more, and each variation tries to achieve an outcome, an edge for its users: lower slippage, lower impermanent loss, or even increased slippage for very large orders that would unbalance the pool.

Conclusion

AMMs have been wildly successful and are one of the bedrock technologies of decentralised finance. They are not risk free, but they do offer the average person a way to participate in financial market services and make yield by providing liquidity.


Install Metamask and get Testnet Tokens

by Matt Heff 12 months ago

This article will cover

  • Installing the MetaMask Extension in your browser
  • Creating a wallet
  • Getting Testnet ETH tokens

What is Metamask

MetaMask is arguably the most popular cryptocurrency wallet. It allows you to store Ether and other ERC-20 tokens, and it also lets you interact with decentralised applications (dapps).

Installing Metamask

Navigate to the MetaMask Download page and select your supported browser (if you are using a different browser, then it’s best to get one of these to follow along with the rest of it). Allow the prompt to install the extension.

Once it is installed you will see an icon in your browser

Click on the icon, if MetaMask does not open up on its own.

When you see this page click on Get Started

At this point we need to create a New Wallet so click on “Create a Wallet”

Select “Create a Wallet”

If you want to help improve metamask you can agree to the request, or decline. It makes no difference with what we will be doing.

Next you get prompted to create a new password; do so and click create. You will need this password each time you start a MetaMask session (it is not the private key to your wallet; we will get to that shortly!)

Secret Recovery Phrase

You can watch a short video on what this is. It is super important to keep for any wallet that has real crypto funds in it! For our dev one, we will only need it if we want to use this same wallet outside of MetaMask, so make a note of it. If you don’t, then don’t sweat it; we can get it from MetaMask later on.

Record the secret recovery phrase

When you see this screen, click on the grey box with a lock on it and you will find 12 words in there. This is your mnemonic; it is the key to your wallet, the keys to your vault if you consider yourself the bank. For your real wallet, keep them safe and away from prying eyes.

Click Next once you are done here and re-enter the words to verify you have them. After this you are done!

You have now installed Metamask and have your very own Crypto Wallet!

Let’s get some Testnet Tokens!

First you may be asking, what the hell is a testnet? In short, it is a network where we can play around without needing to spend any money, to test our code or play with crypto. The testnets do not have anywhere near the same amount of toys to play with as mainnet.

There are a few testnets for ETH; you will need to enable show/hide test networks in MetaMask. To do this, click on Ethereum Mainnet and then click on the Show/hide test networks link.

Click on Ethereum Mainnet then Show/hide test network
You are taken to config page and need to enable Show test networks

You can now see the test networks

Free Tokens from the faucet

A faucet is a place where we can be drip fed some ETH for our transactions. We are going to get them from the Kovan Faucet.

Go to the Kovan Faucet and click on Connect Wallet; there is a button in the upper right or in the main box on the screen. Once you click, you get asked if you want to use MetaMask or another wallet; select MetaMask.

MetaMask will pop up and ask which wallets you want to allow to connect to this site, and to authorise the connection for those wallets

Click on Next, then click on Connect

Error!!!

If you see this message after connecting your wallet then we need to change the network MetaMask is connected to.

Change Metamask to the Kovan Test Network

Click on the Metamask Icon, the wallet will pop up and then click on Ethereum Mainnet, you get presented with a list of networks select Kovan.

Select the Kovan Network
When you have selected the Kovan Network metamask will look like this

This faucet is very smart and detects this change; you will see a page like the image below. You need to prove you are not a robot, and then you can click Send request and the faucet will initiate the transaction to send you 0.1 test ETH and 20 test LINK (uncheck LINK if you don’t want it)

Faucet detects we are on the kovan network and is ready to drip feed us.

Initiate the drip feeding! Once you click on Send request, you will need to wait whilst the transaction processes on the blockchain. It will go through a few phases, and at the end you will see some ETH in your MetaMask wallet

Show me the ETH!

Open up MetaMask and you will see you now have 0.1 ETH.

There you have it! You have setup metamask and have some ETH for the testnet!

If you decided to get some LINK as well and are wondering where it is, don’t fear: it is in your account on the blockchain, but MetaMask does not know about this token yet. You can use the Import tokens link and paste this (0xa36085F69e2889c224210F603D836748e7dC0088) into the Token Contract Address field; MetaMask will detect the other values needed.

Add the tokens contract
Import the token in to MetaMask
See it in your balances

You are now ready to play on the testnet! Woo hoo.

If you are stuck thinking about what you can do:

  • Try setting up a 2nd account and sending some value (ETH or LINK) to yourself!
  • Write your first smart contract