Hello guys, how are you all? Hope you are all fine. Today I got the following error in Python: **ConvergenceWarning: lbfgs failed to converge (status=1): STOP: TOTAL NO. of ITERATIONS REACHED LIMIT**. Here I explain all the possible solutions.

Without wasting your time, let's get started solving this error.


## How ConvergenceWarning: lbfgs failed to converge (status=1): STOP: TOTAL NO. of ITERATIONS REACHED LIMIT Error Occurs?

This warning appears when fitting a model whose iterative solver stops before converging. In my case, I got the following error in Python: **ConvergenceWarning: lbfgs failed to converge (status=1): STOP: TOTAL NO. of ITERATIONS REACHED LIMIT**.

## How To Solve ConvergenceWarning: lbfgs failed to converge (status=1): STOP: TOTAL NO. of ITERATIONS REACHED LIMIT Error ?

To solve this error, try the solutions below.

## Solution 1

The *warning* means exactly what it says: the *solver* (the optimization algorithm) did not converge, and it suggests ways to help it converge.

`lbfgs` stands for "Limited-memory Broyden–Fletcher–Goldfarb–Shanno". It is one of the solver algorithms provided by the Scikit-Learn library.

The term *limited-memory* simply means it stores **only a few** vectors, which implicitly represent an approximation of the gradient.

It has better **convergence** on relatively *small* datasets.

But what is *algorithm convergence*?

In simple words: if the solver's error stays within a very small range (i.e., it is almost not changing), the algorithm has reached a solution (*not necessarily the best solution, as it might be stuck at a so-called "local optimum"*).

On the other hand, if the error is *varying noticeably* (*even if the error is relatively small and the score is already good, but the difference between errors from one iteration to the next is greater than some tolerance*), then we say the algorithm did not converge.
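The idea above can be sketched as a tiny, self-contained illustration (plain Python, not scikit-learn internals; the function name and tolerance value are made up for the example):

```python
# Toy illustration of convergence: an iterative solver "converges" when
# successive error values change by less than some tolerance.
def has_converged(errors, tol=1e-4):
    """Return True if the last two error values differ by less than tol."""
    return len(errors) >= 2 and abs(errors[-1] - errors[-2]) < tol

# Errors still shrinking noticeably -> not converged yet
print(has_converged([0.9, 0.5, 0.3]))      # False
# Errors barely changing between iterations -> converged
print(has_converged([0.30001, 0.30000]))   # True
```

Note that convergence is judged by the *change* in error, not by how small the error itself is.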

Now, you need to know that the Scikit-Learn API sometimes lets you specify the maximum number of iterations the algorithm may take while iteratively searching for the solution:

```python
LogisticRegression(... solver='lbfgs', max_iter=100 ...)
```

As you can see, the default solver in LogisticRegression is 'lbfgs', and the default maximum number of iterations is 100.

As a final note, please keep in mind that increasing the maximum number of iterations does not guarantee convergence, but it certainly helps!
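A minimal sketch of this solution: raise `max_iter` and then check the fitted `n_iter_` attribute, which records how many iterations lbfgs actually ran (the toy dataset here is just a stand-in):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Stand-in dataset for illustration
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Raise the iteration limit above the default of 100
model = LogisticRegression(solver='lbfgs', max_iter=1000)
model.fit(X, y)

# If n_iter_ equals max_iter, the solver most likely stopped early
# without converging and the warning would still appear.
print(model.n_iter_)
```

Checking `n_iter_` after fitting is a quick way to confirm whether the new limit was actually enough.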

## Solution 2

If you are getting the following error for any machine learning algorithm,

```
ConvergenceWarning: lbfgs failed to converge (status=1):
STOP: TOTAL NO. of ITERATIONS REACHED LIMIT.
```

increase the number of iterations (`max_iter`) or scale the data as shown in *6.3. Preprocessing data* of the scikit-learn user guide.

Please also refer to the documentation for alternative solver options: *LogisticRegression()*

Then in that case, construct the model with a higher iteration limit:

```python
from sklearn.linear_model import LogisticRegression

log_model = LogisticRegression(solver='lbfgs', max_iter=1000)
```

because this warning often occurs simply when the solver runs out of iterations.
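If raising `max_iter` is not enough, scaling the features (as the warning itself suggests) often lets lbfgs converge much faster. A sketch using `StandardScaler` in a pipeline; the dataset here is only a stand-in example:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Stand-in dataset whose raw features have very different magnitudes
X, y = load_breast_cancer(return_X_y=True)

# Standardize features to zero mean / unit variance before fitting,
# so the optimization landscape is better conditioned for lbfgs.
pipe = make_pipeline(
    StandardScaler(),
    LogisticRegression(solver='lbfgs', max_iter=1000),
)
pipe.fit(X, y)
print(pipe.score(X, y))
```

Wrapping the scaler and the model in one pipeline ensures the same scaling is applied at both fit and predict time.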

**Summary**

That's all about this issue. I hope these solutions helped you. Comment below with your thoughts and queries, and let me know which solution worked for you. Thank you.
