
Why Blender is so small in size

Whether you are a current user of Blender or new to 3D, you have probably wondered about the size of Blender’s installation files, and how a very good 3D package that can do virtually everything the other 3D applications can do is only about 10% of their size. In this article, we will try to give you a general idea of why Blender remains relatively small in size while the other 3D packages keep growing fast.

Usually, open-source development is not driven by feature marketing meant to retain current clients or to impress and bring in potential customers. Instead, it is driven by a desire to create solid, elegant code that serves the people who use the tool day to day and will leverage it to its full extent, with little to no attention paid to whether it looks slick to a marketing director or impresses shareholders.


Way back in the older versions of Maya and 3ds Max, the whole installed folder was around 200 to 500 MB at most. That is no longer the case: over the years, we have seen a very noticeable growth in 3D software size. Open-source software, generally speaking, is lightweight and portable because there is an interest in keeping the code clean, as opposed to commercial software, where the focus is on features and user experience.

In order to truly understand why Blender is far smaller than popular 3D packages such as 3ds Max, Maya, and Cinema 4D, we need a better understanding of the constant increase in the size of these packages over the years.


Developers don’t remove or clean old code

Products compete with each other for users and customers.

Competition means that program vendors need to add more features and more capabilities; more of this, more of that. Over time, all of that means more code to implement those features, capabilities, and bug fixes, including the patches you get from time to time.

Companies rarely remove code or features. Over time, additions accumulate and make things bigger.

Now, none of this happens overnight. It doesn’t even happen over a year. But over the span of a few years, you’ll start to notice that the program that used to be this big and had a certain requirement, is now that big and runs either exceptionally slow on your old hardware, or now requires more hardware to do its job than it used to.



Too many libraries

The main reason is that today there are tons and tons of libraries out there that developers can use in their applications, and we have developed a culture of using libraries to avoid constantly reinventing the wheel. Software such as 3ds Max or Maya can end up with too many libraries because programmers and software developers can potentially use libraries for these reasons:

They can include another library even if it is going to be used by only one function.

They can include another library even if they only need a tiny subset of the entire wealth of functionality it offers.

They can include another library even if its inclusion will only save them two or three days of extra work.

They can include multiple libraries that serve more or less the same purpose just because different programmers on the payroll happen to already be familiar with different libraries.

Of course, these are just potential reasons for the excessive use of libraries in large 3D software; it does not mean this is always the case, or that developers use external code libraries for these reasons all the time.

Over the years, industry-standard 3D packages have accumulated many external code libraries. This adds to their size because new features and improvements in these products are usually layered on top of the core application, which is why they keep growing in size and load slowly. Layering is the fastest way, though not the most efficient, to get features out the door before the deadline. Extra plugin functionality in Blender, by contrast, normally comes in the form of uncompiled Python scripts, which are essentially just text files. External libraries are where the bulk of the file size comes from, and Blender doesn’t ship with any material or texture libraries, since those are all available as free downloads, while proprietary software like 3ds Max has to include them in the purchase.
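To illustrate the point about uncompiled Python scripts: a Blender add-on is literally a plain-text `.py` file that Blender reads at load time. The sketch below shows the minimal shape of such a file; the add-on name is hypothetical, and a real add-on would also `import bpy` and register operator classes.

```python
# Minimal shape of a Blender add-on file (hypothetical example).
# Blender reads this metadata dictionary straight from the text file;
# nothing here is compiled ahead of time.
bl_info = {
    "name": "Hello World Add-on",   # hypothetical add-on name
    "blender": (2, 80, 0),          # minimum Blender version
    "category": "Object",
}

def register():
    # A real add-on would call bpy.utils.register_class() here.
    print("add-on enabled")

def unregister():
    # ...and bpy.utils.unregister_class() here.
    print("add-on disabled")
```

Because the whole extension is just source text like this, shipping hundreds of add-ons costs Blender kilobytes, not the hundreds of megabytes that compiled plugin binaries would.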


People want more features


Programs that continue to grow and innovate get used by more people than programs that stagnate.

It’s true that not everybody wants more and more. But the market speaks with its wallet and its downloads. Programs that do more and more are more popular. More people buy them; more people download them. That’s the measure that the program vendors use.


More people certainly doesn’t mean everyone, and it may not mean you. But in many cases it does mean the majority and often the vast majority. Companies that don’t continue to improve their products with features and functionality eventually die away. Speed might be a feature that sells somewhat, but size rarely is.


When trying to decide which application to use among several choices, some users assume that the one that occupies more space will be more feature-packed and a more capable piece of software, which is of course not true.

Generally speaking, software can use a lot of space, up to around 10 gigabytes these days, but not to an alarming extent. If we look at what takes up the most space on our drives, for most of us the answer is not applications but media: videos, movies, and video games for the most part. Software has not been bloating at the same rate that storage capacity has been expanding, and it is unlikely that it ever will, so in the future applications are likely to represent a negligible fraction of the storage space available to users.
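To put rough numbers on that claim (the 10 GB figure is from the text above; the 1 TB drive is an assumed but typical size today):

```python
# Share of a drive taken up by one large application.
def fraction_of_drive(app_gb: float, drive_gb: float) -> float:
    return app_gb / drive_gb

# A 10 GB 3D suite on a 1 TB (~1000 GB) drive uses about 1% of it.
share = fraction_of_drive(10, 1000)
print(f"{share:.0%}")  # -> 1%
```

Even a "bloated" 10 GB install is a rounding error next to a modest video or game library on the same drive.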


Computers are becoming stronger and faster

Another factor that can contribute to the increase in 3D software size is the march of technological progress. Machines sold today are significantly more capable than the machines sold just five or ten years ago. An average new computer that 3D artists use today has around 16 GB of RAM, with the potential to upgrade to 64 GB, in addition to around one terabyte of storage space on the hard drive. Ten years ago, we were happy with a few gigabytes of space and never dreamed of having that much RAM!

With hard disks in the terabytes and RAM measured in multiple gigabytes, software developers often choose to develop for the latest and greatest systems, or at least the current ones, so they can provide as much functionality as they possibly can. Older, less capable machines are a shrinking market by definition, so it is really difficult to justify developing software with them in mind. That leaves those of us with older, less capable machines in a difficult spot.

On top of that, there is a general tendency to spend less time optimizing for space and more on introducing new features. This is a natural side effect of larger, faster, cheaper computers for everyone. Yes, it would be possible to write programs as resource-efficient as they were in the 1990s, and the result would be stunningly fast and slick. But it wouldn’t be cost-effective anymore: a good piece of software would take ten years to complete, by which time the requirements would have completely changed. People used to program with extreme attention to efficiency because the old, slow, small machines forced them to, and everyone else was doing it as well. As soon as this changed, the bottleneck for a program’s success shifted from being able to run at all to running more and more shiny things, and that is where we are now.


The increased demand for visual quality


One reason is that the data packaged within applications is larger because it is of higher resolution and quality. An icon back in the days of Netscape was at most 32×32 pixels with at most 8-bit depth (possibly only 4), while now it is probably something like 64×64 in true color with transparency, meaning 32-bit depth. That’s 16 times larger. And space is so cheap that people often don’t even bother checking the “compressed” option when generating a PNG.
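The 16× figure follows from a line of arithmetic on uncompressed pixel data (ignoring any PNG compression):

```python
# Uncompressed icon size: width * height * bytes per pixel.
def icon_bytes(side_px: int, bit_depth: int) -> int:
    return side_px * side_px * (bit_depth // 8)

old_icon = icon_bytes(32, 8)    # 32x32 at 8-bit depth: 1,024 bytes
new_icon = icon_bytes(64, 32)   # 64x64 at 32-bit RGBA: 16,384 bytes
print(new_icon // old_icon)     # -> 16
```

Doubling each side quadruples the pixel count, and quadrupling the bytes per pixel multiplies the total by four again, hence the factor of 16.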

Applications nowadays also carry a mind-boggling amount of data with them, which older applications did not. There are applications today that ship together with a “getting started” presentation video.

In addition, programming languages today tend to come with rich run-time environments, which are fairly large, to the tune of 100 MB each. Even if you do not use all of the features of your run-time environment, you still have to package the whole thing with your app.

Why writing clean code is important for Blender


Programmers and Blender users who know how to code and want to contribute to Blender’s development need to be able to read and comprehend the code. Programmers who come along after you must be able to understand the code easily in order to be motivated to contribute at all.

Without readability and comprehensibility, you cannot easily reuse your code. You will be forced to reinvent the wheel many times over if you decide it is easier to write from scratch than to reuse what you have already written, even when it would otherwise serve perfectly well. When writing open-source software, the same problem applies to other people. Worse still, if your code is unreadable or, once read, incomprehensible, nobody else will bother looking at it very much; you will not get any feedback on it other than (perhaps) complaints, you will not get code contributions, and ultimately your “open source” will be so opaque as to be effectively closed. It may then be easier for others to simply rewrite the needed functionality and put your project “out of business”.

Bugs are also easier to fix when you can understand your own code, and features are easier to add.

inspirationTuts
