Coding Languages Concepts

Writing code (or coding or programming or developing or scripting) is the way humans get computers to do things. Whether it is to solve a problem, automate some tasks, or create a video game, together humans and code can do it all.

Coding is both a technical and a creative exercise. There’s no single right way of doing a task; however, some ways are better than others. When you begin coding, you will have moments of unparalleled frustration followed swiftly by moments of extreme satisfaction. It can be huge fun. It can be a drag. Writing code really is what you make of it.

So enough explanation for now. You’ll learn quickest by actually getting started. Read on for an introduction to writing code.

Just quickly before we jump in, there are a few terms you should know.

Libraries – These are add-ons for a language. You generally download them and then use them in your code. They are pre-made collections of commands/functions/routines that make your code quicker and easier to write. For example, you could use Scikit-learn in Python to build machine learning models rather than writing them from scratch yourself.
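
To see the idea without downloading anything, here is a minimal sketch using Python's built-in math module as a stand-in for a library: you import it, then call its ready-made functions instead of writing them yourself. (Downloaded libraries like Scikit-learn work the same way, just with an install step first.)

```python
# Import the library, then use its pre-made functions.
import math

# No need to implement square roots or factorials yourself.
print(math.sqrt(16))      # 4.0
print(math.factorial(5))  # 120
```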

Syntax – The set of rules that defines how a particular language is written is called syntax. It dictates which words and symbols you use, and in what order. It is essentially the grammar of the coding world. Once you have a go at writing code, you’ll quickly realise that, unlike humans, computers can’t read past typos. If even one comma is in the wrong place, your code will fail! But don’t worry; most languages are designed to help you find any mistakes you make.
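
As a small illustration (a hedged sketch, not something you'd normally write), here is how Python reacts to a single missing character. Rather than guessing at your meaning, it stops and reports the mistake:

```python
# A single missing parenthesis breaks the syntax.
bad = "print('hello'"  # note: no closing ) at the end

try:
    # compile() checks the syntax without running the code.
    compile(bad, "<example>", "exec")
except SyntaxError as err:
    print("Python caught the mistake:", err.msg)
```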

IDE – Stands for integrated development environment. It is the software into which you will actually type out your code and then run/test what you’ve written. The best IDEs will also help you find errors and edit your code quickly. When it comes to picking an IDE, it’s best to pick something common and simple. And if you have a recommendation from a tutorial or friend, just go with that. The only thing to look out for is ensuring your chosen IDE supports the language you want to use.

Note – we are not covering HTML and CSS here since, technically, these are not programming languages: they describe page structure and styling.

Common Languages

There are a huge number of programming languages available. Some are super popular. Some are extremely niche. And many are very similar to one another. It is difficult to decide exactly which is the most used or most useful language, so here we are going to briefly discuss all the big players.

Python

Python is a powerful, accessible, versatile, in-demand, and incredibly popular programming language. It is used by data scientists, web developers, backend engineers, researchers, and so many more. If you want to learn a programming language and don’t really care which one, go for Python.

Python focuses on making life easy for the developer by having a relatively simple syntax. This makes it especially great for first-time coders. It also has a ridiculous number of libraries available covering everything from plotting graphs to running machine learning algorithms to sending Facebook messages. There’s also plenty of job demand for Python developers.
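
To give a flavour of that simple syntax, here is a minimal sketch: no type declarations, no semicolons, and indentation marks the structure, so it reads almost like English.

```python
# Add up a list of numbers and find the largest one.
numbers = [3, 1, 4, 1, 5]

total = 0
for n in numbers:
    total += n

print(total)         # 14
print(max(numbers))  # 5
```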

Whilst Python is used loads in every area of tech, if you’re interested in anything data science related, Python should be especially high on your to-learn list.


JavaScript

Frequently used alongside HTML and CSS in web development, JavaScript (sometimes shortened to just JS) is another very well-known and fairly accessible language. It is similar to Python in many ways, although the syntax is quite different.

You probably won’t want to use JavaScript if you’re in a data science or big data role. However if you’re looking at web/app design, game design, or front-end engineering, JavaScript will be a major asset.

A word of warning: JavaScript is very different from Java! Don’t get them confused.


SQL

SQL is a slightly unusual language in that it is not a general-purpose programming language. It is used to look up (usually called querying) and manage data in databases – specifically, relational databases (also known as SQL databases).

You can pick up the basics quickly since the syntax is very human-readable. Unless you have a specific reason to become very good at SQL, we’d generally recommend spending more time on another language. It is listed here purely because of how widely used it is.
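
To show how readable SQL is, here is a small sketch using Python's built-in sqlite3 module, which ships with a tiny database engine so you can try queries without installing a database server. The table and names are made up for illustration.

```python
# sqlite3 is part of Python's standard library.
import sqlite3

# ":memory:" gives a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [("Ada", 36), ("Grace", 45)])

# "SELECT ... FROM ... WHERE ..." is the core querying pattern.
rows = conn.execute("SELECT name FROM people WHERE age > 40").fetchall()
print(rows)  # [('Grace',)]
```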


Java

Java is the first of our slightly meatier programming languages. It is fairly old (and therefore often found in legacy applications in massive companies such as banks) and becoming a little less popular each year.

The main difference between Java and languages such as Python or JavaScript is that Java is statically typed while Python/JS are dynamically typed. Don’t worry too much about the technical definitions of these terms if you’re just getting started. Just remember that it means Java is fussier: it generally takes more typing to do the same things and is a little less human-friendly. The plus side is that it gives you more control over your development and it generally runs faster.
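
A quick sketch of the difference, shown from the Python side: in a dynamically typed language a variable can be rebound to a value of a different type at any time, whereas a statically typed language like Java fixes the type when the variable is declared.

```python
# Dynamic typing: the same variable can hold different types.
x = 42        # x holds an integer...
x = "hello"   # ...and now a string; Python doesn't complain.
print(type(x).__name__)  # str

# In statically typed Java you declare the type up front,
# e.g. `int x = 42;`, and `x = "hello";` would fail to compile.
```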

C / C++ / C#

This is where things get a little more complicated, both in terms of writing code in these languages and understanding the differences between them. C came first; C++ (pronounced “C plus plus”) extends it, while C# (pronounced “C sharp”) is a separate, more modern language in the same family. Overall, we would not recommend these languages to beginners since, like Java, they are more complicated and therefore harder to learn.

If you do want to have a go at one of them, we’d suggest doing your own research into which will be most useful. But if you do so and are still unsure, we’d say go for C++. It’s overall a more capable version of plain C and is more low-level than C#. This means it is quite a challenge to learn, but you’ll find it easier to transition to C# should you need to (going from C# to C++ is much harder).

C++ has similar precision and speed benefits to Java. It offers fine-grained control over the hardware and so is gaining popularity with the rise of GPU (as opposed to CPU) and cloud computing.