Get Maximum Throughput
The developers at Amazon are working on optimizing their database query times. There are n host servers, where the throughput of the ith host server is given by host_throughput[i].
These host servers are grouped into clusters of size three. The throughput of a cluster, denoted as cluster_throughput, is defined as the median of the host_throughput values of the three servers in the cluster. Each server can be part of at most one cluster, and some servers may remain unused.
The total system throughput, called system_throughput, is the sum of the throughputs of all the clusters formed. The task is to find the maximum possible system_throughput.
Note: The median of a cluster of three host servers is the throughput of the 2nd server when the three throughputs are sorted in either ascending or descending order.
Complete the function getMaxThroughput in the editor.
getMaxThroughput has the following parameter:
int host_throughput[n]: an array denoting the throughput of the host servers
Returns
long: the maximum system_throughput
Example 1
n = 6, and the host throughput is given by host_throughput = [4, 6, 3, 5, 4, 5].
The maximum number of clusters that can be formed is 2.
One possible way to form the clusters is to select the 1st, 2nd, and 3rd host servers for the first cluster, and the 4th, 5th, and 6th host servers for the second cluster. The cluster_throughput of the first cluster [4, 6, 3] will be 4 (the median), and the cluster_throughput of the second cluster [5, 4, 5] will be 5 (the median).
Thus, the system_throughput will be 4 + 5 = 9.
Constraints
Limits and guarantees your solution can rely on.
1 ≤ n ≤ 2 * 10^5
1 ≤ host_throughput[i] ≤ 10^9
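One common greedy approach (a sketch, not the only valid solution): sort the throughputs in ascending order, then build each cluster from the two largest remaining servers plus the single smallest remaining server. The cluster's median is then the second-largest of those three, so the answer is the sum of every second element counting down from the top, for n // 3 clusters. The function name `getMaxThroughput` matches the signature above.

```python
def getMaxThroughput(host_throughput):
    # Sort ascending so the largest values sit at the end of the array.
    a = sorted(host_throughput)
    n = len(a)
    k = n // 3  # maximum number of size-three clusters that can be formed

    total = 0
    # Greedy pairing: cluster i takes the two largest remaining servers and
    # the smallest remaining one. The median of such a cluster is the
    # second-largest element, i.e. a[n - 2], a[n - 4], a[n - 6], ...
    for i in range(1, k + 1):
        total += a[n - 2 * i]
    return total
```

On Example 1, the sorted array is [3, 4, 4, 5, 5, 6] and k = 2, so the sum is a[4] + a[2] = 5 + 4 = 9, matching the expected output. The sort dominates the running time at O(n log n), which fits comfortably within the constraint n ≤ 2 * 10^5.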