The artificial neural network (ANN) is a well-known supervised machine learning approach to classification. However, it suffers from the local minima problem, which degrades the accuracy of its results. To address this problem, a new algorithm called RPSOGAC is proposed. The proposed algorithm combines the strengths of two optimization algorithms, particle swarm optimization (PSO) and the genetic algorithm (GA), to improve the classification accuracy of ANNs. RPSOGAC starts by finding, with the backpropagation algorithm, the weights that lead to the best ANN classification result and adds these weights to the initial population; the remaining individuals of the population are generated randomly. Based on randomness, the algorithm then switches reciprocally and repeatedly between applying GA and PSO until it reaches the best solution. RPSOGAC differs from previous algorithms in two major respects. First, GA and PSO are selected at random, which gives both equal opportunity to improve the classification. Second, during the PSO phase, a competition between two population sets is performed to produce a new population containing the best individuals, which gives promising individuals a chance to improve further in later iterations. Experiments on six datasets drawn from four domains demonstrate the classification accuracy of RPSOGAC, and a comparative study contrasts its accuracy with that of other algorithms. The results show that RPSOGAC outperforms the other approaches on four datasets, while on the remaining two its results are very close. © 2019 University of Bahrain. All rights reserved.
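As a rough illustration of the switching scheme described above, the following Python sketch seeds a population of candidate weight vectors with one backpropagation-derived individual, then alternates at random between a GA step and a PSO step in which new and old positions compete. The `fitness` function, GA operators, and PSO parameters are simplified placeholders assumed here for illustration, not the exact operators of RPSOGAC.

```python
# Minimal sketch of a random GA/PSO switching loop over ANN weight vectors.
# All operators and parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def fitness(w):
    # Placeholder standing in for "classification accuracy of the ANN
    # obtained with weight vector w" (higher is better).
    return -np.sum(w ** 2)

def ga_step(pop):
    # One GA generation: tournament selection, uniform crossover, mutation.
    new_pop = []
    for _ in range(len(pop)):
        a, b = pop[rng.integers(len(pop))], pop[rng.integers(len(pop))]
        p1 = a if fitness(a) > fitness(b) else b
        c, d = pop[rng.integers(len(pop))], pop[rng.integers(len(pop))]
        p2 = c if fitness(c) > fitness(d) else d
        mask = rng.random(p1.shape) < 0.5
        child = np.where(mask, p1, p2)              # uniform crossover
        child += rng.normal(0.0, 0.1, size=child.shape)  # Gaussian mutation
        new_pop.append(child)
    return new_pop

def pso_step(pop, vel, pbest, gbest):
    # One PSO update; each new position competes with the old one and the
    # better individual of the pair is kept (the "competition" idea).
    new_pop = []
    for i, w in enumerate(pop):
        r1, r2 = rng.random(w.shape), rng.random(w.shape)
        vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - w) + 1.5 * r2 * (gbest - w)
        candidate = w + vel[i]
        winner = candidate if fitness(candidate) > fitness(w) else w
        if fitness(winner) > fitness(pbest[i]):
            pbest[i] = winner
        new_pop.append(winner)
    return new_pop

dim, pop_size, iterations = 10, 20, 100

# Seed one individual with backpropagation-trained weights (a stand-in
# vector here); the rest of the population is generated randomly.
backprop_weights = np.zeros(dim)
population = [backprop_weights] + [rng.normal(size=dim) for _ in range(pop_size - 1)]
velocities = [np.zeros(dim) for _ in range(pop_size)]
personal_best = [w.copy() for w in population]
global_best = max(population, key=fitness).copy()

for _ in range(iterations):
    # Randomly and reciprocally switch between GA and PSO.
    if rng.random() < 0.5:
        population = ga_step(population)
    else:
        population = pso_step(population, velocities, personal_best, global_best)
    best = max(population, key=fitness)
    if fitness(best) > fitness(global_best):
        global_best = best.copy()

print("best fitness found:", fitness(global_best))
```

In this sketch the 50/50 switching probability is an assumption; it reflects the abstract's statement that GA and PSO are given equal opportunities, not a parameter reported by the paper.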