The twin parametric-margin support vector machine (TPMSVM) is an effective kernel-based nonparallel classifier. However, TPMSVM was originally designed for binary classification, which makes it unsuitable for real-world multiclass applications. Therefore, this paper extends TPMSVM to multiclass classification and proposes a novel K-class nonparallel parametric-margin support vector machine (MNP-KSVC). Specifically, our MNP-KSVC has the following characteristics. (1) Under the “one-versus-one-versus-rest” multiclass framework, MNP-KSVC encodes the complicated multiclass learning task into a series of subproblems with ternary outputs {−1, 0, +1}. In contrast to the “one-versus-one” or “one-versus-rest” strategies, each subproblem not only focuses on separating the instances of the two selected classes but also incorporates the side information of the remaining class instances. (2) MNP-KSVC seeks a pair of nonparallel parametric-margin hyperplanes for each subproblem, so that each hyperplane is close to its corresponding class and at a distance of at least one from the other class, while the remaining class instances are bounded within an insensitive region. (3) MNP-KSVC combines a hybrid classification-and-regression loss with a regularization term to formulate its optimization model, and the optimal solutions are derived from the corresponding dual problems. Finally, we conduct numerical experiments to compare the proposed method with four state-of-the-art multiclass models: Multi-SVM, MBSVM, MTPMSVM, and Twin-KSVC. The results demonstrate the feasibility and effectiveness of MNP-KSVC in terms of multiclass accuracy and learning time.
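As a rough illustration of the “one-versus-one-versus-rest” encoding described above, the following Python sketch relabels a multiclass training set into the ternary subproblems: the two focused classes receive labels +1 and −1, and all remaining classes receive label 0. The function name and data layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from itertools import combinations

def ovovr_subproblems(X, y):
    """Build ternary-labeled targets for every pair of classes (i, j):
    class i -> +1, class j -> -1, remaining classes -> 0.
    Illustrative sketch of the one-versus-one-versus-rest encoding only."""
    subproblems = {}
    for i, j in combinations(np.unique(y), 2):
        t = np.zeros_like(y, dtype=int)   # remaining (rest) classes keep label 0
        t[y == i] = +1                    # first focused class
        t[y == j] = -1                    # second focused class
        subproblems[(i, j)] = t           # paired with the full X when training
    return subproblems

# Example: a 3-class toy problem yields C(3, 2) = 3 ternary subproblems.
y = np.array([0, 0, 1, 1, 2, 2])
X = np.random.randn(6, 2)
for pair, t in ovovr_subproblems(X, y).items():
    print(pair, t)
```

Each subproblem would then be solved by a pair of nonparallel parametric-margin hyperplanes as described in the paper; the sketch above covers only the label encoding step.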