A group acceptance sampling plan based on flexible new Kumaraswamy exponential distribution: An application to quality control reliability

Abstract In this study, we present a group acceptance sampling plan (GASP) for situations in which the lifetime of an item follows a flexible new Kumaraswamy exponential distribution, with applications to quality control reliability. This study includes a detailed analysis of the operating characteristic function values, the producer risk, and the minimum group size required for the acceptance number, which are determined by three shape parameters. To determine the quality index, we used the median and tested various parametric values to obtain the optimized values. These optimized values are presented in tables and graphically for better understanding. In addition, we explain the results of our study using two real-life datasets and an example. A comparison is also made between the GASP and the ordinary sampling plan (OSP).


Introduction
In recent years, there has been a growing focus on improving, measuring, and monitoring the quality of products, services, and procedures. This trend has been driven by the recognition that there is a strong link between productivity, reputation, quality, and confidence in a brand's image. As a result, companies across various industries have been investing in quality management systems, process improvement initiatives, and customer feedback mechanisms to ensure that they are delivering high-quality products and services that meet the needs and expectations of their customers. Nowadays, companies consider the implementation of statistical quality control (SQC) procedures to be of decisive importance for enhancing their competitiveness in the market. However, quality control (QC) has evolved from its original definition, which primarily involved adjusting production to a standardized model to meet customer requirements. It now extends beyond manufacturing processes and is applied across various industrial and service sectors. These methods are employed to oversee and regulate the quality of products, services, and procedures; they involve statistically analyzing the data collected during production or service delivery to identify and eliminate sources of variability and to ensure consistent adherence to quality standards. SQC also utilizes statistical tools and methodologies to analyze data and make informed decisions based on objective evidence, rather than relying on subjective judgments (Ameeq et al., 2023).
Implementing SQC techniques offers several benefits to organizations. First, it helps identify and rectify issues and flaws early in the production or service delivery process, avoiding the production of defective products or the delivery of subpar services. By monitoring and controlling quality, companies can lessen waste, rework, and customer complaints, resulting in improved efficiency and cost-effectiveness. Additionally, SQC enables organizations to make data-driven decisions by providing valuable insights into the performance and capability of their processes. By analyzing process data, companies can identify areas for improvement, optimize resource allocation, and enhance their overall operational efficiency. This continuous improvement approach helps organizations stay competitive in the market and meet or exceed customer expectations. In addition to its impact on internal operations, SQC also contributes to external factors such as brand image, trust, and customer satisfaction. Consistent delivery of high-quality products and services builds trust among customers, enhances brand reputation, and establishes a competitive edge in the marketplace. Positive customer experiences and satisfaction lead to increased customer loyalty and repeat business, further strengthening the organization's position in the industry (Chang, 2016).
In today's industrial world, it has become essential to produce high-quality products using statistical quality control (SQC) techniques. Product quality is a crucial factor in achieving business success, expansion, and competitiveness. These techniques are helpful for improving product quality in any manufacturing process, as they help to reduce process and product variabilities (Duncan, 1986). The success of any industry is greatly influenced by SQC, which comprises a set of operational procedures that businesses must follow to obtain certification that their products meet consumers' expectations. According to Attali et al. (2013), quality dimensions such as dependability, performance, aesthetics, features, and adherence to standards are essential criteria for assessing a product's quality. These dimensions have emerged as the most significant factors influencing consumer satisfaction when choosing between competing products and services (Rubmann et al., 2015).
Acceptance sampling is a statistical technique used to assess the suitability of a product batch. Essentially, it involves sampling from a batch of products to determine the acceptability of the batch as a whole. The primary objective is to ensure that the batch meets specific standards, which can vary depending on the company or industry. In this method, a random sample of the available products is selected, the sampled products are tested, and the choice to accept or reject the entire batch is made in light of the test results. While this method helps determine whether a batch of products should be accepted, it does not provide an accurate estimate of the overall quality of the entire lot. Typically, the manufacturer provides the consumer with a few samples from the batch. If the number of defects in the samples falls below an acceptable threshold, the consumer approves the entire lot. In the context of inspecting products produced in small batches, a single sampling approach entails picking one sample from the batch and subjecting it to testing in order to determine whether it satisfies specific quality standards. Essentially, the aim is to verify whether the number of defective items falls within the acceptable limit; if the batch fails to meet the established criteria, the entire lot is deemed unacceptable and rejected. A double sampling approach entails selecting two samples from a lot and evaluating them against a preestablished quality standard. This method involves the use of two acceptance numbers. If the number of defective pieces in the first sample is no more than the smaller acceptance number, the entire lot is accepted. Conversely, if the number of defective pieces exceeds the larger acceptance number, the lot is rejected. When the number of defective pieces falls between the first and second acceptance numbers, a second sample is drawn. The final decision regarding acceptance or rejection is based on whether the combined number of defective pieces from both samples surpasses the second acceptance number. Multiple sampling refers to the use of more than two samples in making a decision. Sequential sampling, for instance, involves obtaining several samples. Once each group of samples is collected, a test is conducted to assess whether it meets a predefined quality criterion. If the samples do not surpass the threshold limit, the process is repeated (Muneeb Hassan et al., 2023; Vlcek et al., 2004).
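The double sampling decision rule described above can be sketched as a small function. This is an illustrative sketch, not code from the source: the function name, the string labels, and the boundary convention (accept when the defect count is at most the acceptance number) are assumptions made for clarity.

```python
def double_sampling_decision(defects1, defects2, c1, c2):
    """Accept/reject a lot under a double sampling plan.

    c1, c2: the smaller and larger acceptance numbers.
    defects2: defect count of the second sample, drawn only when the
    first sample is inconclusive.
    """
    if defects1 <= c1:
        return "accept"          # first sample alone is good enough
    if defects1 > c2:
        return "reject"          # first sample alone is bad enough
    # inconclusive: judge the combined count against the larger number
    return "accept" if defects1 + defects2 <= c2 else "reject"
```

For example, with acceptance numbers (2, 5), a first sample with 3 defects is inconclusive, and the lot is accepted only if the two samples together show at most 5 defects.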
The development of accelerated life-testing methods has been a subject of interest for many years. Researchers have proposed various approaches for estimating the reliability of products under accelerated conditions. For instance, Epstein (1954) assumed that an item's lifespan follows an exponential distribution and created an acceptance sampling plan (ASP) based on truncated life tests. Meanwhile, Goode and Kao (1961) proposed sampling strategies and reliability testing procedures for truncated life tests under the premise that the lifetime of a device follows the Weibull distribution. Similarly, Gupta (1962) used normal and log-normal distributions to derive sampling plans for truncated life tests. In recent years, new probability density functions have been introduced to improve accelerated life-testing methods. For instance, Rosaiah et al. (2009) presented a half-logistic distribution as a new pdf in the field of ASP. In addition, Kantam et al. (2001) investigated the ASP problem when a test is terminated at a predetermined time and proposed a method based on truncated life tests. Baklizi (2003) suggested an ASP based on truncated life tests for the Pareto distribution of the second kind, assuming that the shape parameter is known. Finally, Aslam et al. (2010) developed an ASP for a generalized exponential distribution, where the life tests were truncated at a preassigned time.
Several studies on GASP have been proposed to accommodate different product lifetime distributions. For instance, when product lifetimes follow either an inverse Rayleigh or log-logistic distribution, Aslam et al. (2009b) introduced a GASP based on life tests in which multiple items can be tested simultaneously. For the Birnbaum-Saunders distribution, Balakrishnan et al. (2007) proposed an economic reliability strategy, whereas Aslam et al. (2009a) suggested GASP plans for the Weibull distribution with a given shape parameter. Jun et al. (2006) developed sampling plans for Weibull-distributed lifetimes under sudden-death testing. Aslam et al. (2013) developed GASP plans for both the Weibull and the generalized exponential distribution. In another study on GASP based on life tests, Aslam et al. (2009) assumed that the lifetime follows a gamma distribution with known shape parameters. For the Marshall-Olkin Kumaraswamy exponential distribution, Almarashi et al. (2021) provided GASP plans for life tests. Rao (2009a) provided a group acceptance sampling plan based on truncated life tests for the Marshall-Olkin extended Lomax distribution.
Owing to their potential to incorporate additional parameters, it is becoming increasingly common to create novel statistical probability distributions based on baseline distributions, G-families, and compounding techniques to examine the tail features of the distributions. Here, we discuss some G-families to demonstrate the validity of the current study. Such G-families, from the skew-normal family suggested by Azzalini (1985) to the more recent logistic-X family (2015), have received growing attention in the statistical literature. For the bounded unit interval (0, 1), Kumaraswamy (1980) proposed a two-parameter Kumaraswamy model, which we express here using the random variable (rv) Y ~ Kw(λ, κ).
The cumulative distribution function (cdf) and probability density function (pdf) of the Kumaraswamy distribution of Y are F(y; λ, κ) = 1 − (1 − y^λ)^κ and f(y; λ, κ) = λκ y^(λ−1) (1 − y^λ)^(κ−1), 0 < y < 1, respectively.
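A minimal numerical sketch of the Kw(λ, κ) cdf and pdf follows, assuming the standard parameterization F(y) = 1 − (1 − y^λ)^κ on (0, 1); the function names and the midpoint-rule integration check are illustrative, not part of the source.

```python
def kw_cdf(y, lam, kappa):
    """cdf of the two-parameter Kumaraswamy distribution on (0, 1)."""
    return 1.0 - (1.0 - y**lam) ** kappa

def kw_pdf(y, lam, kappa):
    """pdf of the two-parameter Kumaraswamy distribution on (0, 1)."""
    return lam * kappa * y ** (lam - 1) * (1.0 - y**lam) ** (kappa - 1)

# sanity check: the pdf should integrate to 1 over (0, 1) (midpoint rule)
lam, kappa = 2.0, 3.0
n = 200_000
total = sum(kw_pdf((i + 0.5) / n, lam, kappa) for i in range(n)) / n
print(round(total, 4))
```

The same pair of functions can be reused with any baseline cdf plugged in place of `y`, which is exactly the construction the Kw-G family below formalizes.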
Cordeiro and de Castro (2011) described the cdf and pdf of the Kw-G family by F(x; λ, κ, ξ) = 1 − [1 − G(x; ξ)^λ]^κ and f(x; λ, κ, ξ) = λκ g(x; ξ) G(x; ξ)^(λ−1) [1 − G(x; ξ)^λ]^(κ−1), where λ and κ are the two extra shape parameters and ξ is the vector of baseline parameters.
The objective of this study is to design a GASP for the new Kumaraswamy exponential (NKwE) distribution. The median was used as the quality parameter in the current investigation; Rao (2009b) indicated that, for a skewed distribution, the median performs better than the mean. The NKwE distribution is skewed, and, according to the literature, no such work has yet been carried out for it. The GASP for the NKwE model was designed to address specified consumer and producer risks with high quality. In addition, the minimum number of groups, the acceptance number, the consumer risk, and the test termination time are required for a particular GASP. Future investigations based on the recommended sampling method for determining the nanoquality level (NQL) for goods that adhere to different probability distributions under the NKw-G family scheme will build on the findings of the current study.
Following is the structure of the remaining sections of this paper: • The theoretical and mathematical foundations of the NKw-G family are presented in Section 2.
• Section 3 shows the details of how the GASP is structured for the lifetime percentile using a truncated life test.
• Section 4 comprises an explanation and example of the suggested GASP under the NKwE model.
• In Section 5, an application is performed by using two real-life datasets.
• Section 6 presents a comparison of the GASP and OSP.
• Finally, Section 7 summarises the findings of the current study.

Presentation of the NKwE distribution
In this section, we discuss the cdf, pdf, and quantile function (qf) of the NKw-G family using an exponential distribution as the baseline; the pdf, cdf, and qf of the resulting NKwE distribution are also described. See El-Morshedy et al. (2022) for the detailed mathematical derivation. The NKw-G family cdf is defined in Equation (5), where α and s are shape parameters with α, s > 0. Differentiating Equation (5) gives the pdf of the NKw-G family in Equation (6), and the qf of the NKw-G family is expressed in Equation (7). The cdf, pdf, and qf of the NKwE distribution are obtained by substituting the cdf, pdf, and qf of the exponential distribution into Equations (5)-(7); that is, we take g(w) = θe^(−θw) and G(w) = 1 − e^(−θw). The resulting cdf and pdf of the NKwE distribution are given in Equations (8) and (9). Some possible pdf and hrf shapes of the NKwE distribution are shown in Figures 1(a,b) and 2(a,b). These figures illustrate that the NKwE pdf can be symmetric, reversed-J shaped, or right-skewed. The hrf plots show several flexible shapes: decreasing, upside-down bathtub-shaped, and reversed bathtub-shaped. The hrf quantifies the features of a lifetime distribution and is used to understand the instantaneous rate of event occurrence over time, which is crucial in fields such as survival analysis and reliability engineering.
To express the qf of the NKwE distribution, the p-th quantile of the exponential distribution is taken as w_p = −(1/θ) log(1 − p). Thus, the p-th quantile w_p of the NKwE distribution, obtained using Equation (7), is given in Equation (10). For the current study, the median is taken as the quality parameter; the median of the NKwE distribution is obtained by substituting p = 0.5 into Equation (10) and is written as:
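Since the NKwE quantile function is family-specific, the median can also be recovered numerically by inverting any monotone cdf. The sketch below, with hypothetical function names, checks a generic bisection inversion against the closed-form exponential baseline quantile w_p = −(1/θ) log(1 − p) used above; the exponential cdf stands in for the NKwE cdf of Equation (8).

```python
import math

def quantile_by_bisection(cdf, p, lo=0.0, hi=1e6, tol=1e-10):
    """Invert a monotone cdf numerically: find w with cdf(w) = p."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta = 0.5
exp_cdf = lambda w: 1.0 - math.exp(-theta * w)   # exponential baseline cdf

median_numeric = quantile_by_bisection(exp_cdf, 0.5)
median_exact = -math.log(1 - 0.5) / theta        # w_p = -(1/theta) log(1 - p)
print(round(median_numeric, 6), round(median_exact, 6))
```

The same bisection applies unchanged once the NKwE cdf of Equation (8) is supplied in place of `exp_cdf`, which is useful when the closed-form qf is cumbersome.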

Description of GASP under the NKwE distribution
The design parameters of a GASP are now obtained under the NKwE distribution. The procedure for implementing the group sampling plan and obtaining the design parameters was taken from Gupta (1962) and is as follows: • Creating a series of g groups and allocating r items to each one. As a result, the lot sample size is n = g × r.
• Choosing c as the acceptance number for every group at time w₀.
• Carrying out the experiment for all g groups simultaneously and recording the number of failures in each group.
• Accepting the lot if no more than c failures occur in any group.
• If any group records more than c failures, terminating the experiment and rejecting the lot.
For a given r, the suggested GASP is specified by two parameters (g, c). From Equation (8), the cdf of the NKwE distribution depends on t, α, and s, and the median life of the NKwE distribution is presented in Equation (10). It is convenient to set the termination time w₀ as w₀ = a₁v₀, where a₁ denotes a positive constant and v₀ refers to the specified life. For instance, if a₁ = 0.5, the test duration is half the specified life, whereas if a₁ = 3, the test duration is three times the specified life. The probability of accepting a lot in this situation is given in Equation (12), where p denotes the probability that an item in a group fails before w₀; this failure probability is derived by substituting Equation (10) into Equation (8). Based on Equation (10), we set θ = −φ/v. Now, substituting θ = −φ/v and w = a₁v₀ into Equation (8), the failure probability is obtained, and it can be expressed in terms of the ratio r₂ = v/v₀. When a₁ and r₂ = v/v₀ are specified, p can be determined for given α and s. A product's acceptable quality level may be expressed through the ratio of the true median life to the specified life of a product, v/v₀. It then remains to minimize g and c as min{g, c : (g, r, c)} subject to the following constraints, where r₁ and r₂ denote the median ratios at the consumer's risk and at the producer's risk, respectively. The failure probabilities used in Equations (13) and (14) are expressed in Equations (15) and (16), both of which are extracted from Equation (14).
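Under the usual GASP formulation, the lot is accepted only if each of the g groups of r items shows at most c failures, so the lot acceptance probability is the g-th power of a binomial tail. A hedged sketch of the resulting search for a minimal plan follows; the function names, loop bounds, and the example failure probabilities are hypothetical stand-ins for the values that Equations (15) and (16) would supply from the NKwE cdf.

```python
from math import comb

def gasp_accept_prob(p, g, r, c):
    """Lot acceptance probability of a GASP: the lot is accepted only if
    every one of the g groups of r items shows at most c failures."""
    group_ok = sum(comb(r, i) * p**i * (1 - p) ** (r - i) for i in range(c + 1))
    return group_ok ** g

def design_gasp(p_consumer, p_producer, r, beta=0.25, alpha=0.05, g_max=1000):
    """Smallest plan (g, c) meeting both risk constraints:
    acceptance probability <= beta at the consumer's failure probability
    and >= 1 - alpha at the producer's failure probability."""
    for c in range(r):                      # try small acceptance numbers first
        for g in range(1, g_max + 1):
            if (gasp_accept_prob(p_consumer, g, r, c) <= beta
                    and gasp_accept_prob(p_producer, g, r, c) >= 1 - alpha):
                return g, c
    return None

# hypothetical failure probabilities standing in for Equations (15) and (16)
print(design_gasp(p_consumer=0.5, p_producer=0.05, r=5))
```

Tables of minimal (g, c) such as Tables 1 and 2 can be generated by running a search of this kind over a grid of a₁, r₂, and shape-parameter values.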

Description of GASP with example
The utilization of a sampling technique is beneficial in terms of saving both time and money. Various sampling methods involve conducting specific quality control tests to determine whether a batch should be accepted or rejected. This section presents an illustration of a GASP that operates on the assumption that an item's lifespan follows the NKwE model, with the shape parameters α and s known, and utilizes the cdf described in Equation (8). In this GASP scenario, a random sample of size n is chosen, distributed into groups of r items each, and subjected to life testing over a predetermined time period. Tables 1 and 2 show the design parameters for the GASP with distinct values of s (1.02 and 2.0), taking r = 5 and r = 10. The tables also show that reducing the consumer's risk increases the number of groups, and that, as r₂ rises, the number of groups shrinks quickly. Beyond a certain point, the number of groups and the acceptance numbers stay constant, and the probability of accepting the lot gradually starts to decline. The effect of a₁ is also shown in the tables. For example, on the life test with η = 0.25, a₁ = 0.5, r₂ = 6, s = 1.02, and r = 5, eight groups, i.e., 40 (8 × 5) units, are required. Furthermore, for r = 10, the life test requires only 10 units in total; as a result, r = 10 would be preferable in this scenario. Table 2 reports the corresponding results for s = 2.0. Increasing the shape parameter value results in a reduced group size for the associated plan, consistent with the reported data. As the true median life grows, the number of groups drops and the OC values P(a) increase for the investigated GASP when using the NKwE distribution with the median lifetime as the quality criterion; this is presented in Table 2 for several parametric values (s = 1.02, η = 0.25, r = 5, and a₁ = 0.5). An example from Smith et al. (2023) is considered in Table 3 to cross-check the results displayed in Tables 1 and 2.
Assume that the specified lifespan of a clock motor following the NKwE distribution with shape parameter s = 1.02 is 4000 cycles. When the true median lifetime is 8000 cycles as opposed to the specified life of 4000 cycles, the producer faces a 5% risk while the consumer bears a 25% risk. An analyst will now run a 2000-cycle test with 10 units in every group to check whether the median lifespan of the clock motors is larger than the specified lifespan. For this scenario, we have s = 1.02, v₀ = 4000 cycles, a₁ = 0.5, r = 10, η = 0.25, r₁ = 1, producer's risk = 0.05, and r₂ = 2. Moreover, from Table 2, we have g = 64 and c = 7. This means that 640 (64 × 10) units have to be drawn, with 10 units allocated to each of the 64 groups. If no more than 7 units fail in any of these groups before 2000 cycles, the median life of the clock motor can be statistically guaranteed to be greater than the specified life. In other words, if a quality control investigator wants to test the claim that the clock motor has a specified lifespan of 4000 cycles when the true median life is twice that, the investigator can test the 64 groups; if no more than 7 items fail within 2000 cycles, the investigator will conclude, with 95% confidence, that the median lifespan is greater than the specified life, because a₁ = 0.5 makes the test time 2000 cycles. Therefore, the lot under study should be accepted.
Figure 3 demonstrates how the values of g and c tend to decrease as the true median lifetime increases, while the operating characteristic (OC) values steadily increase. Therefore, the lot under consideration will be accepted within certain test periods. Accepting the lot at r = 10 would be preferable in this instance for optimizing time and money, since fewer groups would be assessed than at r = 5 (taking, for example, s = 1.25).
The maximum likelihood estimates (MLEs), standard errors (SE), Kolmogorov-Smirnov (KS) statistic, and p-value (PV) for the NKwE distribution are shown in Table 4. Some parametric values from Tables 1 and 2 are used for the graphical representation of g and the OC values in Figure 3. The histogram of the data with the estimated pdf, the estimated cdf, the probability-probability (P-P) plot, the quantile-quantile (Q-Q) plot, the TTT plot, and the estimated hrf is shown in Figure 4. Figure 4 indicates that the NKwE model fits the survival data set well. The plan parameters are likewise determined from the estimated parametric values and are displayed in Table 5. The results for the plan parameters in Table 5 are shown to be consistent with the values in Tables 1 and 2.
The maximum likelihood estimates (MLEs), standard errors (SE), Kolmogorov-Smirnov (KS) statistic, and p-value (PV) for the NKwE distribution are shown in Table 6.
The histogram of the data with the estimated pdf, the estimated cdf, the probability-probability (P-P) plot, the quantile-quantile (Q-Q) plot, the TTT plot, and the estimated hrf is shown in Figure 5.
Figure 5 indicates that the NKwE model fits the survival data set well; thus, the NKwE model gives a reasonable fit to the data. The plan parameters are calculated using the fitted parametric values and are shown in Table 7. It can be observed that the plan parameters in Table 7 are consistent with the values in Tables 1 and 2.
The descriptive analysis of the two data sets is shown in Table 8; all values were calculated using the R programming language and are presented graphically in Figure 6. Table 9 shows the true median lifetimes, sample sizes, g, c, and OC values for the two data sets. Considering data set I with η = 0.01, r = 5, a₁ = 0.5, α̂ = 1.1253, and ŝ = 3.4506, as g and c decrease and the acceptance probability increases, the lot will be accepted. Similarly, considering data set II with η = 0.01, r = 10, a₁ = 0.5, α̂ = 2.5047, and ŝ = 12.7570, as g and c decrease and the acceptance probability increases, the lot will be accepted. It is clear that for data sets I and II fewer groups need to be tested, which optimizes cost and time.

Comparative study of GASP versus OSP
A procedure characterized as lot sentencing uses ASPs to decide whether incoming or outgoing batches should be accepted or rejected according to a predetermined quality level. The sample size and the duration of the investigation are the two factors that practitioners should consider most carefully, and both should be optimized. Although the OSP can help with this optimization, in that case only a single item is evaluated at a time. In contrast, a GASP can achieve optimal cost and effort because several items can be evaluated by grouping them together. Table 10 shows the sample sizes of the GASP and the OSP when η = 0.25 and a₁ = 0.5 for the two data sets. It is evident from Table 10 that the GASP is superior to the OSP, as fewer items need to be tested at once, which results in optimized time and cost.
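The comparison can be sketched by designing the corresponding ordinary single sampling plan from the binomial OC function and contrasting its sample size n with the GASP total g × r. The function names, risk levels, and failure probabilities below are hypothetical, not the fitted values behind Table 10.

```python
from math import comb

def binom_accept(p, n, c):
    """OC function of an ordinary single sampling plan: accept the lot
    when at most c of the n tested items fail."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(c + 1))

def design_osp(p_consumer, p_producer, beta=0.25, alpha=0.05, n_max=500):
    """Smallest sample size n (with its acceptance number c) meeting
    both the consumer's and the producer's risk constraints."""
    for n in range(1, n_max + 1):
        for c in range(n + 1):
            if (binom_accept(p_consumer, n, c) <= beta
                    and binom_accept(p_producer, n, c) >= 1 - alpha):
                return n, c
    return None

# hypothetical risk points, not the fitted values from the paper's datasets
print(design_osp(p_consumer=0.5, p_producer=0.05))
```

With these illustrative risk points the item counts can be similar; the practical advantage of the GASP in such cases lies in testing r items simultaneously per group, which shortens the total test duration.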

Conclusion
This study focuses on a GASP that assumes the lifespan of a product will follow the NKwE distribution.
The study emphasizes a few critical plan parameters, specifically the number of groups g and the acceptance number c, which are determined by balancing the risks of producers and consumers. The proposed plan aims to maintain a specific quality level by setting a limit on the number of defective items that can be present in a batch. As the percentile ratio (the ratio of the true median life to the stipulated life) increases in the proposed plan, the number of groups g and the acceptance number c tend to decrease, while the operating characteristic (OC) values tend to increase. The study concludes that the proposed plan is effective in ensuring that a specific level of quality is maintained while also minimizing the risks for both the producer and the consumer. By balancing the risks, the plan helps prevent the acceptance of defective items while avoiding the rejection of good products. Overall, this study contributes to the development of the GASP methodology for products with a lifespan that follows the NKwE distribution. The proposed plan can be useful for manufacturers and consumers in various industries, as it provides a way to ensure product quality while minimizing risk. However, the applicability and effectiveness of the GASP may vary depending on the specific industry, product characteristics, and quality requirements. Engineers should carefully assess the suitability of the GASP for their particular situations and consult relevant standards and guidelines to ensure its proper implementation.

Funding
No funding was obtained for this study.

Figure
Figure 1. Different graphical representations of the pdf of the NKwE distribution.

Figure
Figure 2. Different graphical representations of the hrf of the NKwE distribution.
Table 1. GASP displaying minimal g and c for α = 0.8 and s = 1.02; cells that would require an impractically large sample size contain hyphens (-). Table 2. GASP displaying minimal g and c for α = 0.8 and s = 2.0; cells that would require an impractically large sample size contain hyphens (-).

Figure
Figure 6. Plots of descriptive analysis for (a) data set I and (b) data set II.

Table 9.
True median lifetimes, sample sizes, g, c, and OC values of the two data sets