You are given two 0-indexed integer arrays nums1 and nums2, both of length n, and a positive integer k. You must choose a subsequence of k indices; the same indices select elements from both nums1 and nums2.
The score of the chosen subsequence is the sum of the selected elements from nums1, multiplied by the minimum of the selected elements from nums2.
Return the maximum possible score.
A subsequence of indices is a set of distinct indices chosen from [0, 1, ..., n - 1]. The elements selected by those indices from nums1 and nums2 are used to compute the score.
Input: nums1 = [1,3,3,2], nums2 = [2,1,3,4], k = 3
Output: 12
Explanation: Selecting indices 0, 2, and 3 gives nums1 values [1,3,2] (sum = 6) and nums2 values [2,3,4] (min = 2). Score = 6 * 2 = 12.
Input: nums1 = [4,2,3,1,1], nums2 = [7,5,10,9,6], k = 1
Output: 30
Explanation: Selecting index 2 gives nums1 value [3] (sum = 3) and nums2 value [10] (min = 10). Score = 3 * 10 = 30.
Input: nums1 = [2,1,14,12], nums2 = [11,7,13,6], k = 3
Output: 168
Explanation: Selecting indices 0, 2, and 3 gives nums1 values [2,14,12] (sum = 28) and nums2 values [11,13,6] (min = 6). Score = 28 * 6 = 168.
Constraints:
n == nums1.length == nums2.length
1 <= n <= 10^5
0 <= nums1[i], nums2[i] <= 10^5
1 <= k <= n
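One standard approach (a sketch, not part of the problem statement) is to sort the index pairs by their nums2 value in descending order. Scanning pairs in that order, the current nums2 value is the minimum over everything seen so far, so it suffices to keep the k largest nums1 values seen so far in a min-heap and track their running sum. The function name max_score below is illustrative.

```python
import heapq

def max_score(nums1, nums2, k):
    # Sort pairs by nums2 descending: while scanning, the current nums2
    # value is the minimum among all pairs considered so far.
    pairs = sorted(zip(nums2, nums1), reverse=True)
    heap = []   # min-heap holding the k largest nums1 values seen so far
    total = 0   # running sum of the values currently in the heap
    best = 0
    for n2, n1 in pairs:
        heapq.heappush(heap, n1)
        total += n1
        if len(heap) > k:
            total -= heapq.heappop(heap)  # evict the smallest nums1 value
        if len(heap) == k:
            best = max(best, total * n2)
    return best
```

On the examples above this returns 12, 30, and 168 respectively; the sort costs O(n log n) and each heap operation O(log k), so the whole sketch runs in O(n log n).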