TensorFlow in crisis! And this time the threat comes from Google itself

Author: QbitAI | Date: 2022.06.20

Xiao Xiao, from Aofei Temple. QbitAI | Public account QbitAI

TensorFlow, which has earned nearly 166,000 stars on GitHub and witnessed the rise of deep learning, has run into a storm.

And this time, the blow comes not from its old rival PyTorch, but from the rookie JAX.

In the latest wave of discussion in AI circles, Jeremy Howard, founder of fast.ai, said:

It is widely known that JAX is gradually replacing TensorFlow. Now it is actually happening (at least inside Google).

LeCun even believes that the fierce competition among deep learning frameworks has entered a new stage.

In LeCun's words, Google's TensorFlow was indeed once more popular than Torch. But after Meta's PyTorch appeared, its popularity overtook TensorFlow's.

Now Google Brain, DeepMind, and many external projects have begun to use JAX.

A typical example is the recently viral DALL·E Mini. To make full use of TPUs, its author programmed it in JAX. After trying it, one user sighed:

This is much faster than PyTorch.

According to Business Insider, JAX is expected to cover all of Google's machine-learning-powered products within the next few years.

Seen in this light, the move looks more like a self-rescue that Google has launched on the framework front.

Where does JAX come from?

In fact, Google has long been laying the groundwork for JAX.

As early as 2018, it was created by a small team at Google Brain.

The results were published in a paper titled "Compiling machine learning programs via high-level tracing":

JAX is a Python library for high-performance numerical computing; deep learning is only one of its functions.

Since its birth, its popularity has been rising.

Its biggest selling point is speed.

Take an example.

Computing the sum of the first three powers of a matrix takes NumPy about 478 milliseconds.

With JAX, the same computation takes only 5.54 milliseconds, 86 times faster than NumPy.
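In code, that comparison reads roughly as follows. This is a minimal sketch, not the article's actual benchmark: the matrix size (1000×1000, float32), the function names, and the timing harness are all illustrative assumptions.

```python
import numpy as np
import jax.numpy as jnp
from jax import jit

# Sum of the first three powers of a matrix: x + x^2 + x^3
def np_fn(x):
    return x + x @ x + x @ x @ x

@jit  # compiled by XLA on first call, cached afterwards
def jax_fn(x):
    return x + x @ x + x @ x @ x

x_np = np.random.randn(1000, 1000).astype(np.float32)
x_jax = jnp.asarray(x_np)

jax_fn(x_jax).block_until_ready()  # warm-up so compile time is excluded

# In IPython, the two versions can then be timed with:
#   %timeit np_fn(x_np)
#   %timeit jax_fn(x_jax).block_until_ready()
```

The `block_until_ready()` call matters when timing: JAX dispatches work asynchronously, so without it you would measure only the dispatch, not the computation itself.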

Why is it so fast? There are many reasons, including:

1. Accelerated NumPy. NumPy's importance needs no introduction: nobody doing scientific computing or machine learning in Python can do without it, yet it has never supported hardware acceleration such as GPUs.

JAX's numerical API follows NumPy's, which lets models run easily on GPUs and TPUs. This alone has won over many people.

2. XLA. XLA (Accelerated Linear Algebra) is an optimizing compiler for linear algebra. JAX is built on XLA, which greatly raises the ceiling of its computation speed.

3. JIT. Researchers can use XLA to turn their functions into just-in-time (JIT) compiled versions; adding a simple decorator to a numerical function can speed it up by several orders of magnitude (see the sketch after this list).
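As a concrete illustration of point 3, here is a minimal JIT sketch; the `selu` activation below is a common JAX demo function, not code from the article:

```python
import jax.numpy as jnp
from jax import jit

@jit  # traced once per input shape/dtype, compiled by XLA, then reused
def selu(x, alpha=1.67, lam=1.05):
    return lam * jnp.where(x > 0, x, alpha * jnp.exp(x) - alpha)

x = jnp.arange(1_000_000, dtype=jnp.float32)
selu(x).block_until_ready()  # first call compiles; later calls hit the cache
```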

In addition, JAX is fully compatible with Autograd and supports automatic differentiation. Through functions such as grad, hessian, jacfwd, and jacrev, it supports both reverse-mode and forward-mode differentiation, and the two modes can be composed arbitrarily.
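A minimal sketch of that composability (the function `f` below is an illustrative choice, not from the article):

```python
import jax.numpy as jnp
from jax import grad, jacfwd, jacrev

def f(x):
    return jnp.sum(jnp.tanh(x) ** 2)

df = grad(f)                   # reverse-mode gradient
hessian_f = jacfwd(jacrev(f))  # forward over reverse; jax.hessian is equivalent

x = jnp.array([1.0, 2.0, 3.0])
print(df(x))         # shape (3,): the gradient
print(hessian_f(x))  # shape (3, 3): the Hessian
```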

Of course, JAX also has its shortcomings.

For example:

1. Although JAX is known as an accelerator, it does not fully optimize every operation for CPU computation.

2. JAX is still too new and has not formed a complete base ecosystem the way TensorFlow has. As a result, Google has not yet launched it as a finished product.

3. Debugging takes time and effort, and its handling of "side effects" is not yet completely clear.

4. It does not support Windows; it can only run inside a virtual environment on top of it.

5. It has no data loader of its own; you have to borrow one from TensorFlow or PyTorch (see the sketch after this list).
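To illustrate point 5, here is a minimal sketch of borrowing PyTorch's DataLoader to feed JAX; the toy dataset and the collate function are illustrative assumptions:

```python
import numpy as np
import jax.numpy as jnp
import torch
from torch.utils.data import DataLoader, TensorDataset

def numpy_collate(batch):
    # Convert torch tensors to NumPy so the batches are framework-neutral.
    xs, ys = zip(*batch)
    return (np.stack([x.numpy() for x in xs]),
            np.stack([y.numpy() for y in ys]))

dataset = TensorDataset(torch.randn(256, 32), torch.arange(256))
loader = DataLoader(dataset, batch_size=64, collate_fn=numpy_collate)

for xb, yb in loader:
    xb, yb = jnp.asarray(xb), jnp.asarray(yb)  # device transfer happens here
    # ...run a jitted training step on (xb, yb) here...
```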


Nevertheless, JAX, simple, flexible, and easy to use, first became popular inside DeepMind. Deep learning libraries born in 2020, such as Haiku and RLax, are built on it.

This year, Adam Paszke, one of PyTorch's original authors, also joined the JAX team full-time.

At present, the JAX open-source project has 18.4K stars on GitHub, and its star count is growing much faster than TensorFlow's did at the same stage.

It is worth noting that, along the way, many voices have said it is likely to replace TensorFlow.

On the one hand, this is due to JAX's own strength; on the other hand, and mainly, it comes down to problems with TensorFlow itself.

Why is Google switching to JAX?

TensorFlow, born in 2015, was once the darling of the field. After launch, it quickly surpassed Torch, Theano, Caffe, and other veterans to become the most popular machine learning framework. In 2017, however, the newly arrived PyTorch shook things up.

PyTorch is a machine learning library that Meta built on top of Torch. Because it is simple and easy to understand, it quickly won the favor of many researchers and has even shown signs of overtaking TensorFlow.

TensorFlow, by contrast, has grown increasingly bloated through frequent updates and interface iterations, gradually losing developers' trust.

(Judging by the share of questions on Stack Overflow, PyTorch has risen year after year while TensorFlow has stagnated.)

In the competition, TensorFlow's shortcomings have gradually been exposed. Problems of unstable APIs, complexity, and a steep learning curve have not been resolved by updates; instead, its structure has grown ever more complicated.

At the same time, TensorFlow has failed to keep leveraging the "runtime efficiency" advantage it could have pressed.

In academia, PyTorch's usage has gradually surpassed TensorFlow's.

At top conferences such as ACL and ICLR in particular, PyTorch implementations have accounted for more than 80% of papers in recent years, while TensorFlow's share keeps declining.

It is for this reason that Google can no longer sit still, and is trying to use JAX to recapture "dominance" of the machine learning framework landscape.

Although JAX is not nominally "a general-purpose framework for deep learning", Google's resources have been tilting toward it ever since its release.

On the one hand, Google Brain and DeepMind have gradually built more and more libraries on top of JAX.

These include Google's Trax, Flax, and JAX-MD, as well as DeepMind's neural network library Haiku and reinforcement learning library RLax, all built on JAX.

In Google's own words:

As the JAX ecosystem develops, we will also take care to keep it consistent with the design of existing TensorFlow libraries (such as Sonnet and TRFL).

On the other hand, more projects have begun adopting JAX. The recently viral DALL·E Mini is one of them.

Because it makes better use of Google's TPUs, JAX delivers much better runtime performance than PyTorch on them. More industrial projects built on TensorFlow have also been switching to JAX.

Some netizens even joked about why JAX exists at all: perhaps it was TensorFlow's own users who could no longer stand the framework.

So, can JAX hope to replace TensorFlow and become a new force to rival PyTorch?

Which framework are people betting on?

Overall, many people still stand firmly behind PyTorch.

They do not seem to like the pace at which Google changes course year after year.

"Although JAX is very attractive, it is not enough to have the ability to" revolutionary "to promote everyone to abandon PyTorch to use it."

But those optimistic about JAX are not a minority either.

Some say that PyTorch is excellent, but JAX is narrowing the gap.

Some even raved about JAX, claiming it is ten times more powerful than PyTorch and that if Meta does not keep pushing, Google will win. (Tongue firmly in cheek.)

However, there are always people who care little about who wins or loses and take the long view:

There is no best, only better. What matters most is that more players and good ideas join in, so that open source becomes synonymous with truly excellent innovation.

- END -
