Task Scheduling

The data analysts of Hackerland want to schedule some long-running tasks on remote servers to minimize the total cost of running them. The analysts have two servers: a paid one and a free one. The free server can be used only while the paid server is occupied.

The nth task is expected to take time[n] units of time to complete and the cost of processing the task on the paid server is cost[n]. The task can be run on the free server only if some task is already running on the paid server. The cost of the free server is 0 and it can process any task in 1 unit of time.

Find the minimum cost to complete all the tasks if tasks are scheduled optimally.

Function Description:

Complete the function getMinCost in the editor.

The function getMinCost has the following parameter:

  • int cost[n]: the costs of running the tasks on the paid server
  • int time[n]: the times taken to run the tasks on the paid server

Return:

int: the minimum cost to complete all the tasks

Possible solution hint: an O(n²) DP, implemented as a memoized DFS that decides each task's server one by one, which is equivalent to traversing all the cases ☕️ :)

Example 1:

Input:  cost = [1, 1, 3, 4], time = [3, 1, 2, 3]
Output: 1
Explanation:
The first task must be scheduled on the paid server for a cost of 1 and it takes 3 units of time to complete. In the meantime, the other three tasks are executed on the free server for no cost as the free server takes only 1 unit to complete any task. Return the total cost, 1.

Example 2:

Input:  cost = [1, 2, 3, 2], time = [1, 2, 3, 2]
Output: 3
Explanation:
The first and the second tasks are scheduled on the paid server; while they run, the last two tasks are executed in parallel on the free server. The total cost is 1 + 2 = 3.

Constraints:
    • 1 ≤ n ≤ 10^5
    • 1 ≤ cost[i] ≤ 10^6
    • 1 ≤ time[i] ≤ 10^7
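
The hinted O(n²) DP can be sketched as a memoized DFS. This sketch assumes tasks are processed in the given order (consistent with both examples) and tracks, as the second state dimension, how many upcoming tasks the free server can still absorb: running task i on the paid server for time[i] units lets the free server complete up to time[i] tasks in the meantime. For the largest inputs the state count may need further tightening, but the recurrence itself is:

```python
from functools import lru_cache

def getMinCost(cost, time):
    """Minimum total cost to complete all tasks, taken in order."""
    n = len(cost)

    @lru_cache(maxsize=None)
    def dfs(i, free):
        # i: next task to schedule; free: number of upcoming tasks
        # the free server can still absorb.
        if n - i <= free:
            return 0  # every remaining task fits on the free server
        # Option 1: pay for task i; while it runs for time[i] units,
        # the free server can absorb up to time[i] more tasks.
        best = cost[i] + dfs(i + 1, min(free + time[i], n))
        # Option 2: spend one unit of accrued free-server capacity.
        if free > 0:
            best = min(best, dfs(i + 1, free - 1))
        return best

    return dfs(0, 0)
```

On Example 1 this returns 1 (pay for the first task, absorb the rest) and on Example 2 it returns 3 (pay for the first two tasks).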