Welcome to a curated list of handpicked free online resources related to IT, cloud, Big Data, programming languages, and DevOps. Fresh news and a community-maintained list of links, updated daily.

Distributed computing system models


Tags distributed programming learning app-development software-architecture

Distributed computing refers to a system where processing and data storage are distributed across multiple devices or systems, rather than being handled by a single central device. By @geeksforgeeks.org.

The article's main sections:

  • Physical model
  • Architectural model
    • Client-Server model
    • Peer-to-peer model
    • Layered model
    • Micro-services model
  • Fundamental models
    • Interaction model
    • Remote Procedure Call (RPC)
    • Failure model
    • Security model

Failure Model – This model addresses the faults and failures that occur in a distributed computing system. It provides a framework for identifying and rectifying faults that occur, or may occur, in the system. Fault-tolerance mechanisms handle failures through replication, error detection, and recovery. Excellent read for anybody interested in distributed systems!
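As a rough, hypothetical sketch of the replication idea in the failure model (names and logic are mine, not from the article), a client can mask one replica's crash by retrying the same request against a backup:

```python
# Sketch: masking replica failures via replication and failover.
# The replica callables below are hypothetical, for illustration only.

def query_with_failover(replicas, request):
    """Try each replica in turn; return the first successful response."""
    errors = []
    for replica in replicas:
        try:
            return replica(request)   # a replica is any callable here
        except ConnectionError as exc:
            errors.append(exc)        # record the fault, try the next replica
    raise ConnectionError(f"all {len(replicas)} replicas failed: {errors}")

def dead_replica(request):
    raise ConnectionError("replica down")

def live_replica(request):
    return f"result for {request}"

# The first replica fails, but the request still succeeds via the second:
print(query_with_failover([dead_replica, live_replica], "balance?"))
```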

[Read More]

Notes on teaching Test Driven Development


Tags tdd programming learning app-development software

Notes from an interesting exercise in which the author helped a client learn how to apply Test Driven Development and developer testing from scratch. The developer in question was very inquisitive and tried hard to understand how best to apply testing, and even a little TDD. By @jeremydmiller.

The notes summary:

  • The purpose of an automated test suite is to help you know when it’s safe to ship code and provide an effective feedback loop that helps you modify code.
  • Test Driven Development (TDD) is primarily a low-level design technique and an important feedback loop for coding.
  • When applying TDD, consider how you’ll test your code upfront as an input to how the code is going to be written in the first place.
  • Approach any bigger development task by first trying to pick out the individual tasks or responsibilities within the larger user story.
  • Focus on isolating validation logic into its own function where you can easily test inputs and do simple assertions against the expected state.
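That last point can be sketched with a hypothetical example (names invented for illustration): validation logic isolated into a pure function, so tests are just inputs and simple assertions against the expected state:

```python
# Hypothetical example: validation logic isolated into a pure function,
# testable with plain inputs and simple assertions -- no setup, no mocks.

def validate_order(order: dict) -> list[str]:
    """Return a list of validation problems; an empty list means valid."""
    problems = []
    if not order.get("customer_id"):
        problems.append("missing customer id")
    if order.get("quantity", 0) <= 0:
        problems.append("quantity must be positive")
    return problems

# The tests need no database and no framework:
assert validate_order({"customer_id": "c1", "quantity": 2}) == []
assert validate_order({"quantity": 0}) == [
    "missing customer id",
    "quantity must be positive",
]
```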

What the author did tell his client was to approach any bigger development task by first trying to pick out the individual tasks or responsibilities within the larger user story. In the end, you want to be quick enough with your testing and coding mechanics that your progress is limited only by how fast you can think. Nice one!

[Read More]

Streams in Scala - introductory guide


Tags akka scala programming learning streaming queues

Streams in Scala provide a lazy evaluation mechanism where elements are computed on-demand rather than being eagerly evaluated and stored in memory. This allows for efficient memory utilization, especially when dealing with large datasets or potentially infinite sequences of data. By Aniefiok Akpan.

There are many reasons for using a stream-processing approach when writing software. In this blog post I’m going to focus on just one of those reasons: Memory Management.

The article covers the following topics:

  • Why Streams
  • Scala Stream
  • Call-by-name (CBN)
  • A Simple use-case of Scala Stream
  • Alternative Libraries that implement Streams
// With LazyList, the content of the files is not loaded into memory
files.map { file =>
  scala.io.Source.fromFile(file).getLines() // illustrative completion; the original snippet was truncated here
}

The author highlights the advantages of processing elements one at a time and retaining only the necessary elements in memory. With streams, you can confidently tackle memory-intensive tasks, knowing that the memory footprint is optimized, leading to more stable and scalable applications. There is also a GitHub repo provided showcasing stream use. Good read!
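The article's examples are in Scala, but the same on-demand evaluation can be sketched with Python generators, which are lazy in the same spirit as LazyList:

```python
# Sketch (Python, for illustration): generators compute elements on demand,
# like Scala's LazyList, so even an infinite sequence uses constant memory.

def naturals():
    n = 0
    while True:        # conceptually infinite; nothing is materialized
        yield n
        n += 1

def take(stream, count):
    """Force evaluation of the first `count` elements only."""
    return [next(stream) for _ in range(count)]

squares = (n * n for n in naturals())  # still lazy: no squares computed yet
print(take(squares, 5))                # only now are 5 elements evaluated
```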

[Read More]

What is the difference between tech debt and code debt?


Tags management cio learning agile

The article explains the difference between code debt and technical debt, two concepts that are often used in software development. Code debt refers to the potential cost that developers incur when they take shortcuts or implement quick fixes during the coding process, such as hard-coding values, duplicating code, or using deprecated libraries. By Sofia Jürgenson.

The article also discusses:

  • Understanding code debt
  • Decoding technical debt
  • How code debt and technical debt differ
  • Addressing code debt and technical debt
    • Prioritize regular code refactoring
    • Adopt agile methodologies
    • Incorporate debt into the definition of done
    • Implement automated testing and continuous integration
    • Document everything
  • Code debt and technical debt management with no-code platforms

Good documentation is vital for managing technical debt. It forms a knowledge base that provides understanding about the system, making it easier to maintain and upgrade existing functionalities and technologies. Good read!

[Read More]

Measuring trends in Artificial Intelligence


Tags ai data-science cio learning big-data

The AI Index is an independent initiative at the Stanford Institute for Human-Centered Artificial Intelligence (HAI), led by the AI Index Steering Committee, an interdisciplinary group of experts from across academia and industry. The annual report tracks, collates, distills, and visualizes data relating to artificial intelligence, enabling decision-makers to take meaningful action to advance AI responsibly and ethically with humans in mind. By stanford.edu.

The latest edition includes data from a broad set of academic, private, and nonprofit organizations as well as more self-collected data and original analysis than any previous edition, including an expanded technical performance chapter, a new survey of robotics researchers around the world, data on global AI legislation records in 25 countries, and a new chapter with an in-depth analysis of technical AI ethics metrics. The 2022 AI Index Report is split into five chapters:

  • Research and development
  • Technical performance
  • Technical AI ethics
  • The economy and education
  • AI policy and governance

Despite rising geopolitical tensions, the United States and China had the greatest number of cross-country collaborations in AI publications from 2010 to 2021, a number that increased fivefold over that period. The collaboration between the two countries produced 2.7 times more publications than that between the United Kingdom and China, the second highest on the list. Very interesting!

[Read More]

Artificial intelligence is a very real data center problem


Tags ai data-science cio database big-data

It would be stupid for us not to consider the consequences as tools like OpenAI's ChatGPT or Google's Bard proliferate and introduce machine intelligence to everyday people. That includes how our data centers are evolving amid the rapid growth in data that needs to be stored, processed, managed, and transferred. By Dr. Michael Lebby.

AI could be the Achilles heel for data centers unable to evolve in the face of the massive datasets required for AI. The article then focuses on:

  • From the Agora to hyper connected global markets: the rise of AI and modulators
  • Survival by the numbers: measuring the strain of AI
  • Avoiding data traffic jams
  • Gauging the impact of AI
  • Alleviating data center strain

If we look at the growth of computing power in high computational processing systems over the past 60 years, we know that this growth initially doubled every 3-5 years. Then, from about 2020 onwards, the growth has increased by over an order of magnitude, or 10X, to a doubling of computational power every 3-4 months (in terms of petaflops, a metric for computational processing magnitude).
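A quick sanity check of those figures (my arithmetic, not the author's): doubling every d months compounds to 2^(12/d) growth per year, so a 3-4 month doubling time is indeed roughly an order of magnitude annually:

```python
# Doubling every d months compounds to 2**(12/d) growth per year.
for d in (3, 3.5, 4):
    print(f"doubling every {d} months -> {2 ** (12 / d):.1f}x per year")
```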

While AI is expected to grow in maturity and accelerate in popularity, the impact on data centers is serious and will impart an incredible level of strain on the future of data center architecture. Five negative impacts are outlined in this article, with one alleviation being the implementation and design of very high-performance polymer optical modulators, which have already demonstrated a capability to modulate light faster, reduce power consumption, and fit in a tiny footprint the size of a grain of salt. Good read!

[Read More]

Discussing PostgreSQL: What changes in version 16


Tags mysql json cio database web-development

The article discusses the new features and improvements in PostgreSQL 16, the latest version of the open source relational database. By Amit Kapila.

The article covers the following topics:

  • The performance enhancements in query execution, bulk loading, and logical replication, which include more query parallelism, CPU acceleration, and load balancing.
  • The new SQL/JSON syntax, which allows users to query and manipulate JSON data using various operators and functions.
  • The new access control rules for managing policies across large fleets of PostgreSQL instances, which enable users to define fine-grained permissions for different roles and contexts.

The article also shares some insights on the future plans and directions for PostgreSQL development, such as supporting more languages and frameworks, enhancing security and reliability, and improving documentation and community engagement. Nice one!

[Read More]

Working with Postgres JSON query


Tags mysql json microservices database web-development

The article explains how to work with Postgres JSON Query, which is a feature that allows you to store and query JSON data in PostgreSQL. By Pratibha Sarin.

It covers the following topics:

  • What is JSON data and why store it in PostgreSQL
  • What are the differences between JSON and JSONB data types
  • What are the advantages of Postgres JSON Query
  • How to create, insert, query, and manipulate JSON data using various operators and functions
  • How to work with Postgres JSONB Query, which is a more advanced version of JSON Query that supports nested arrays and objects

You can have the best of both worlds by storing & querying JSON/JSONB data in your PostgreSQL tables. Postgres JSON Query offers you the adaptability and effectiveness of a NoSQL database combined with all the advantages of a relational database. Good read!
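As a rough, server-free stand-in for the article's PostgreSQL examples, SQLite's JSON1 functions (bundled with Python's sqlite3 in most builds) show the same store-then-query pattern. Table and column names here are invented; in Postgres you would use a jsonb column with the -> / ->> operators instead of json_extract:

```python
import sqlite3

# Illustrative stand-in for Postgres JSON querying, using SQLite's JSON1
# functions on an in-memory database (no server required).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, doc TEXT)")
conn.execute(
    "INSERT INTO orders (doc) VALUES (?)",
    ('{"customer": "alice", "items": [{"sku": "a1", "qty": 2}]}',),
)

# Extract a scalar field from the JSON document, as ->> would in Postgres:
customer = conn.execute(
    "SELECT json_extract(doc, '$.customer') FROM orders"
).fetchone()[0]
print(customer)  # alice

# Path expressions reach into nested arrays and objects too:
qty = conn.execute(
    "SELECT json_extract(doc, '$.items[0].qty') FROM orders"
).fetchone()[0]
print(qty)  # 2
```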

[Read More]

How to use MailHog to test emails locally (step-by-step guide)


Tags tdd programming microservices agile web-development

MailHog is an open source email testing tool that allows developers to test their email sending and receiving capabilities more efficiently. It is a lightweight and easy-to-use tool that can be run on multiple operating systems, including Windows, Linux, FreeBSD, and macOS. By Salman Ravoof.

MailHog works by setting up a fake SMTP server on the developer’s local machine. The developer can then configure their web application to use MailHog’s SMTP server to send and receive emails. This allows the developer to test their email functionality without having to send emails to a real server.
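For example, here is a minimal sketch of sending through MailHog's default SMTP port (1025; the web UI listens on 8025). The addresses are placeholders, and the send is guarded so the snippet is harmless if MailHog is not running:

```python
import smtplib
from email.message import EmailMessage

# Build a test message; addresses are placeholders for illustration.
msg = EmailMessage()
msg["From"] = "app@example.test"
msg["To"] = "user@example.test"
msg["Subject"] = "Password reset"
msg.set_content("Click the link to reset your password.")

# MailHog's fake SMTP server listens on localhost:1025 by default.
try:
    with smtplib.SMTP("localhost", 1025) as smtp:
        smtp.send_message(msg)  # MailHog catches it; nothing leaves the machine
except OSError:
    print("MailHog is not running; start it and re-run to capture the mail")
```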

MailHog provides a number of features that make it a valuable tool for developers, including:

  • Easy to set up and use: MailHog can be installed and configured in just a few minutes. It does not require any external dependencies, and it can be run on any operating system that supports Go.
  • Web-based interface: MailHog provides a web-based interface that allows the developer to view all of the emails that have been sent and received by MailHog. The interface also allows the developer to search for emails and to view the contents of emails.
  • SMTP server emulation: MailHog emulates a real SMTP server, so the developer can test their email functionality without having to send emails to a real server. This can help to prevent problems with spam filters and blacklists.
  • SMTP server logging: MailHog logs all of the SMTP traffic that it handles. This can be helpful for troubleshooting problems with email delivery.

Overall, MailHog is a powerful and versatile tool that can help developers to test their email functionality more efficiently. Good read!

[Read More]

Google Cloud Next 2023 FinOps product announcements recap


Tags cloud cio management fintech

Google Cloud recently announced a number of new FinOps features and enhancements at Google Next ‘23. These new features and enhancements are designed to help organizations optimize their cloud costs and get the most value from their cloud investment. By Sarah McMullin.

One of the most significant announcements was the launch of the FinOps Hub. The FinOps Hub is a central place where organizations can manage all of their FinOps activities. It provides a single view of cloud costs, recommendations for cost optimizations, and tools to help organizations implement those recommendations.

Google Cloud also announced a number of other new FinOps features and enhancements, including:

  • Cost Budgets for project users: This new feature allows organizations to set budgets for individual project users. This can help organizations to control cloud costs and identify users who are overspending.
  • Committed Use Discount recommendations: This new feature uses machine learning to recommend the right type and size of CUD for each workload. This can help organizations to save money on their cloud costs by taking advantage of CUDs.
  • Pricing API: This new API allows organizations to retrieve Google Cloud pricing data in real time. This data can be used to develop custom cost optimization tools and to make informed decisions about cloud pricing.

The new FinOps features and enhancements announced at Google Next ‘23 are a significant step forward for Google Cloud’s FinOps capabilities. These new features and enhancements will help organizations to optimize their cloud costs and get the most value from their cloud investment. Interesting!

[Read More]