This is a successful case study of exam assistance for the Combinatorial Optimization course in the Applied Mathematics program at JHU (USA); with our help, the student earned a satisfying grade.

Math Exam Help | Combinatorial Optimization

There are 7 questions on this test.
 You have to upload your answers on Gradescope by 11:59 pm US Eastern time; contact the instructor if the upload does not work. No answers will be accepted after this deadline.
 You are not allowed to discuss any problem with another human being, except
the instructor and the TAs for the course.
 You can use a computer only as a word processor; in particular, you cannot consult the internet regarding this midterm. You CAN use any other resource, such as the textbook, your notes, or books from the library.
 You CAN cite any result we have mentioned in class or from the HWs without
proof. If you cite a result (e.g., from a book) that was NOT mentioned in class,
you should include a complete proof of this fact.
 The level of rigor expected is the same as in the HW solutions. Make sure you justify all your answers.
 There will be partial credit; e.g., if you do not prove part (i) of some question,
but use part (i)’s result to prove part (ii), you can get full credit for part (ii).
I attest that I have completed this exam without unauthorized assistance from any
person, materials, or device.

Problem 1.

  1. (20 pts) We say that a set $X \subseteq \mathbb{R}^n$ is bounded if and only if there exists a number $M \geq 0$ such that $X \subseteq\left\{x \in \mathbb{R}^n:-M \leq x_i \leq M \quad \forall i=1, \ldots, n\right\}$.

Let $P=\left\{x \in \mathbb{R}^n: A x \leq b\right\}$ where $A$ is an $m \times n$ matrix and $b \in \mathbb{R}^m$, and assume that $P$ is nonempty. Define the set $C:=\left\{r \in \mathbb{R}^n: A r \leq 0\right\}$. Show that $P$ is bounded if and only if $C=\{0\}$.
[Hint: $P$ is bounded if and only if the linear programs $\max \left\{x_i: A x \leq b\right\}$ and $\max \left\{-x_i : A x \leq b\right\}$ have finite values for all $i=1, \ldots, n$.]

First, we show that if $P$ is bounded, then $C = \{0\}$. Suppose, for contradiction, that $P$ is bounded but there exists $r \in C$ with $r \neq 0$. Since $P$ is nonempty, pick any $x^0 \in P$. For every $\lambda \geq 0$ we have $A(x^0 + \lambda r) = Ax^0 + \lambda Ar \leq Ax^0 \leq b$, so $x^0 + \lambda r \in P$. Since $r \neq 0$, some component $r_j$ is nonzero, and $|x^0_j + \lambda r_j| \to \infty$ as $\lambda \to \infty$; hence no $M \geq 0$ satisfies $P \subseteq \{x \in \mathbb{R}^n : -M \leq x_i \leq M \ \forall i\}$, contradicting the boundedness of $P$. Therefore $C = \{0\}$.

Next, we show that if $C = \{0\}$, then $P$ is bounded. Fix $i \in \{1,\dots,n\}$ and consider the linear program $\max\{x_i : Ax \leq b\}$, which is feasible because $P$ is nonempty. If it were unbounded, there would exist a direction $r$ with $Ar \leq 0$ and $r_i > 0$ (this characterization of unbounded linear programs follows, e.g., from LP duality together with Farkas' lemma); such an $r$ would be a nonzero element of $C$, contradicting $C = \{0\}$. Hence $\max\{x_i : Ax \leq b\}$ has a finite value $U_i$, and the same argument shows that $\max\{-x_i : Ax \leq b\}$ has a finite value $L_i$. Set $M := \max\{0, U_1, \dots, U_n, L_1, \dots, L_n\}$. Every $x \in P$ then satisfies $x_i \leq U_i \leq M$ and $-x_i \leq L_i \leq M$ for all $i$, i.e., $-M \leq x_i \leq M$, so $P$ is bounded.

Thus, we have shown that $P$ is bounded if and only if $C = \{0\}$.
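As a computational companion to the hint, here is a minimal sketch that decides boundedness of $P=\{x : Ax \leq b\}$ by solving the $2n$ linear programs $\max\{x_i : Ax \leq b\}$ and $\max\{-x_i : Ax \leq b\}$. It assumes SciPy is available; the two polyhedra at the bottom are made-up illustration data, not part of the exam.

```python
import numpy as np
from scipy.optimize import linprog

def is_bounded(A, b):
    """Check whether P = {x : Ax <= b} is bounded by solving the 2n LPs
    max x_i and max -x_i over P (cf. the hint).  linprog minimizes, so we
    negate the objective; status 3 means the LP is unbounded."""
    n = A.shape[1]
    for i in range(n):
        for sign in (+1.0, -1.0):
            c = np.zeros(n)
            c[i] = -sign  # minimize -sign*x_i  <=>  maximize sign*x_i
            res = linprog(c, A_ub=A, b_ub=b,
                          bounds=[(None, None)] * n, method="highs")
            if res.status == 3:  # unbounded objective => P is unbounded
                return False
    return True

# Illustrative data (assumption): the unit square {0 <= x <= 1} is bounded,
# while the half-plane {x1 <= 1} is not.
A_square = np.array([[1.0, 0], [-1, 0], [0, 1], [0, -1]])
b_square = np.array([1.0, 0, 1, 0])
print(is_bounded(A_square, b_square))                       # True
print(is_bounded(np.array([[1.0, 0.0]]), np.array([1.0])))  # False
```

Equivalently, one could test whether the recession cone $C$ contains a nonzero vector, but checking the $2n$ linear programs mirrors the hint directly.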

Problem 2.

  1. Game of Slither. The game of Slither is played on a graph $G=(V, E)$ between two players. The players, called First and Second, play alternately, with First playing first. At each step
    the player whose turn it is chooses a previously unchosen edge. The only rule is that at every step the set of chosen edges forms a simple path (i.e., a path with no repeated vertices). The loser is the first player unable to make a legal play at his or her turn.
    (i) (10 pts) Show that if $G$ has a perfect matching (a matching which leaves no vertex exposed), then First has a winning strategy.
    (ii) (10 pts) Suppose First’s first move is an edge $u v$ such that $u$ is an inessential vertex of $G$. Show that Second can now force a win. [Recall that a vertex is inessential if there exists a maximum matching that leaves this vertex exposed.]

(i) Suppose $G$ has a perfect matching $M$. First’s strategy is to always play an edge of $M$: the opening move is an arbitrary edge of $M$, and after every move by Second, First answers with the $M$-edge at the vertex Second has just added to the path.

We claim this strategy maintains the invariant that, after each of First’s moves, the vertices of the current path are perfectly matched by the $M$-edges lying on the path. The invariant clearly holds after the opening move. Suppose it holds and Second now plays a legal edge $xy$, where $x$ is an endpoint of the path and $y$ is a new vertex. Since $M$ is perfect, $y$ has an $M$-partner $z$. If $z$ were already on the path, then by the invariant $z$ is $M$-matched to another vertex of the path, contradicting the fact that $M$ is a matching; hence $z$ is not on the path, the edge $yz$ legally extends the simple path at $y$, and First plays it, restoring the invariant. Thus First always has a legal move. Since each edge is used at most once the game must end, so it is Second who eventually cannot move, and First wins.
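To illustrate the strategy in part (i), here is a small brute-force check in plain Python: it explores every legal line of play for Second and confirms that First, always answering with the matching edge at the vertex Second has just added, never runs out of moves. The 6-cycle and its perfect matching at the bottom are illustrative data; this is a sanity check of the argument above, not part of the proof.

```python
from collections import deque

def slither_first_wins(adj, matching):
    """Exhaustively verify the strategy of part (i): First opens with an edge
    of the perfect matching and thereafter always answers with the matching
    edge at the vertex Second has just added.  `adj` maps each vertex to the
    set of its neighbours; `matching` maps each vertex to its partner."""

    def extend(path, end, y):
        new_path = deque(path)
        if end == new_path[0]:
            new_path.appendleft(y)
        else:
            new_path.append(y)
        return new_path

    def legal_moves(path, used):
        # A legal move adds an unused edge at either end of the simple path.
        return [(end, y) for end in (path[0], path[-1]) for y in adj[end]
                if y not in path and frozenset((end, y)) not in used]

    def explore(path, used):
        # It is Second's turn.  Try every legal reply; if none exists, Second loses.
        for end, y in legal_moves(path, used):
            p1 = extend(path, end, y)
            u1 = used | {frozenset((end, y))}
            z = matching[y]                      # First's prescribed answer
            if z in p1 or frozenset((y, z)) in u1:
                return False                     # strategy failed (should not happen)
            if not explore(extend(p1, y, z), u1 | {frozenset((y, z))}):
                return False
        return True

    a = next(iter(matching))                     # First's opening: a matching edge
    return explore(deque([a, matching[a]]), {frozenset((a, matching[a]))})

# Illustrative data (assumption): the 6-cycle, whose alternate edges form a
# perfect matching.
n = 6
adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
matching = {i: i ^ 1 for i in range(n)}          # pairs (0,1), (2,3), (4,5)
print(slither_first_wins(adj, matching))         # expected: True
```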

(ii) Suppose First’s opening move is an edge $uv$ where $u$ is inessential, and let $M$ be a maximum matching that leaves $u$ exposed. Note first that $v$ is covered by $M$: otherwise $M \cup \{uv\}$ would be a larger matching. Second’s strategy is to always answer with the $M$-edge at the vertex First has just added to the path. We claim this maintains the invariant that, after each of Second’s moves, every vertex of the path except $u$ is $M$-matched to one of its neighbours on the path; in particular, both endpoints of the path are covered by $M$-edges of the path, and the portion of the path between $u$ and either endpoint is $M$-alternating, starting with a non-$M$ edge at the exposed vertex $u$. The invariant holds after Second’s first move, the $M$-edge at $v$. Suppose it holds and First plays a legal edge $xy$, where $x$ is an endpoint of the path (possibly $x = u$) and $y$ is a new vertex. If $y$ were exposed by $M$, then the portion of the path from $u$ to $x$, followed by the edge $xy$, would be an $M$-alternating path joining the two $M$-exposed vertices $u$ and $y$, i.e., an $M$-augmenting path, contradicting the maximality of $M$. So $y$ is matched by $M$ to some vertex $z$. If $z$ were on the path, then $z \neq u$ (as $u$ is exposed), so by the invariant $z$ is already $M$-matched to a path vertex other than $y$, contradicting that $M$ is a matching. Hence $z$ is off the path, Second legally plays $yz$, and the invariant is restored. Thus Second never runs out of moves; since the game must end, it is First who eventually cannot move, and Second wins.
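The same kind of exhaustive check can be run for part (ii), with the roles reversed: the sketch below lets First explore every legal continuation after the opening edge $uv$, while Second always answers with the edge of a maximum matching $M$ that exposes $u$. The 5-vertex path and its matching at the bottom are illustrative data, again only a sanity check of the argument.

```python
from collections import deque

def slither_second_wins(adj, matching, u, v):
    """Exhaustively verify the strategy of part (ii): First opens with the
    edge u-v, where u is exposed by the maximum matching `matching`, and
    Second thereafter always answers with the matching edge at the vertex
    First has just added.  `matching` maps every matched vertex to its
    partner (the exposed vertex u does not appear as a key)."""

    def extend(path, end, y):
        new_path = deque(path)
        if end == new_path[0]:
            new_path.appendleft(y)
        else:
            new_path.append(y)
        return new_path

    def legal_moves(path, used):
        return [(end, y) for end in (path[0], path[-1]) for y in adj[end]
                if y not in path and frozenset((end, y)) not in used]

    def explore(path, used):
        # It is First's turn.  Try every legal move; if none exists, First loses.
        for end, y in legal_moves(path, used):
            p1 = extend(path, end, y)
            u1 = used | {frozenset((end, y))}
            z = matching.get(y)                  # Second's prescribed answer
            if z is None or z in p1 or frozenset((y, z)) in u1:
                return False                     # strategy failed (should not happen)
            if not explore(extend(p1, y, z), u1 | {frozenset((y, z))}):
                return False
        return True

    # Second's first reply: the matching edge at v.
    path, used = deque([u, v]), {frozenset((u, v))}
    w = matching[v]
    return explore(extend(path, v, w), used | {frozenset((v, w))})

# Illustrative data (assumption): the path 0-1-2-3-4; the maximum matching
# {1-2, 3-4} exposes vertex 0, so 0 is inessential and First opens with 0-1.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
matching = {1: 2, 2: 1, 3: 4, 4: 3}
print(slither_second_wins(adj, matching, u=0, v=1))  # expected: True
```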

Problem 3.

  1. (20 pts) We consider the problem of scheduling a set $J$ of jobs on $M \in \mathbb{N}$ uniform parallel machines. Each job $j \in J$ has a processing requirement $p_j \in \mathbb{N}$, which denotes the number of days needed to process the job on any one of the machines. Each job also comes with a release date $r_j \in \mathbb{N}$, representing the beginning of the day on which job $j$ becomes available for processing, and a due date $d_j \geq r_j+p_j$, representing the beginning of the day by the end of which job $j$ must be completed ($d_j$ is also a natural number). We start our count of days from day 0. Any machine can process only one job in a single day, and any job can be processed by at most one machine on a given day. However, we allow preemptions, i.e., we can interrupt a job and process it on different machines on different days (recall that all machines are uniform, i.e., have the same processing speed). The scheduling problem is to determine whether there exists a feasible schedule of jobs assigned to machines on different days such that all jobs are processed within their respective windows.

Give an algorithm that decides in polynomial time whether a feasible schedule exists or not. The algorithm should be polynomial time in the data, that is, polynomial in $|J|$, $M$, $\sum_{j \in J}\left(\log r_j+\log d_j+\log p_j\right)$.
[Hint: The following construction could be useful: Arrange the numbers $r_j, d_j$ in ascending order and create $2|J|-1$ intervals of consecutive days defined by these numbers.]

We solve this problem by reducing it to a maximum-flow computation, using the interval construction suggested in the hint.

First, sort the $2|J|$ numbers $\{r_j : j \in J\} \cup \{d_j : j \in J\}$ in ascending order. Consecutive distinct values define at most $2|J|-1$ intervals $I_1, \dots, I_k$ of consecutive days, and since every $r_j$ and every $d_j$ is a breakpoint, each job's processing window $[r_j, d_j)$ (which contains $d_j - r_j \geq p_j$ days) is exactly a union of some of these intervals. For instance, two jobs with $r = (0,2)$ and $d = (5,4)$ give breakpoints $0, 2, 4, 5$ and intervals $[0,2)$, $[2,4)$, $[4,5)$.

Next, we construct a flow network with a source $s$, a sink $t$, a node $u_j$ for each job $j \in J$, and a node $w_\ell$ for each interval $I_\ell$. We add an edge $s \to u_j$ of capacity $p_j$, the total processing requirement of job $j$. Whenever $I_\ell \subseteq [r_j, d_j)$, we add an edge $u_j \to w_\ell$ of capacity $|I_\ell|$, the number of days in $I_\ell$, since job $j$ can be processed on at most one machine per day and hence receives at most $|I_\ell|$ days of processing within that interval. Finally, we add an edge $w_\ell \to t$ of capacity $M \cdot |I_\ell|$, since the $M$ machines can jointly perform at most $M|I_\ell|$ job-days of work during $I_\ell$.

We then compute a maximum $s$-$t$ flow with any polynomial-time maximum-flow algorithm and report that a feasible schedule exists if and only if the maximum flow value equals $\sum_{j \in J} p_j$, i.e., if and only if every edge leaving $s$ is saturated. A feasible schedule clearly yields such a flow: route one unit along $s \to u_j \to w_\ell \to t$ for every day on which job $j$ is processed during $I_\ell$; all capacities are respected. Conversely, since all capacities are integers there is an integral maximum flow, and a flow of value $\sum_j p_j$ prescribes $f_{j\ell}$ days of processing of job $j$ within interval $I_\ell$, where $f_{j\ell} \leq |I_\ell|$ and $\sum_j f_{j\ell} \leq M|I_\ell|$. Such an assignment can always be turned into a valid schedule inside each interval, for example by McNaughton's wrap-around rule: list the day-slots of the jobs consecutively in a single sequence of length $\sum_j f_{j\ell} \leq M|I_\ell|$ and cut it into $M$ segments of length $|I_\ell|$, one per machine; because each $f_{j\ell} \leq |I_\ell|$, no job ever occupies the same day on two machines.

The network has $O(|J|)$ job nodes, $O(|J|)$ interval nodes, and $O(|J|^2)$ edges, and it can be built in $O(|J|^2)$ time after sorting the breakpoints. All capacities are integers whose bit-length is polynomial in the input size, and the maximum flow can be computed by a strongly polynomial algorithm (e.g., Edmonds-Karp) in time polynomial in the number of nodes and edges. Hence the overall running time is polynomial in $|J|$, $\log M$, and $\sum_{j \in J}(\log r_j + \log d_j + \log p_j)$, and in particular polynomial in the parameters required by the problem.
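Below is a compact sketch of this reduction, assuming the networkx library for the maximum-flow computation; the three jobs and two machines at the bottom are made-up illustration data.

```python
import networkx as nx

def feasible_schedule_exists(jobs, M):
    """jobs: list of (r_j, d_j, p_j); M: number of identical machines.
    Returns True iff all jobs can be preemptively scheduled within their
    windows, via the interval/max-flow reduction described above."""
    # Breakpoints define at most 2|J|-1 elementary intervals [a, b).
    points = sorted({r for r, d, p in jobs} | {d for r, d, p in jobs})
    intervals = [(a, b) for a, b in zip(points, points[1:])]

    G = nx.DiGraph()
    for j, (r, d, p) in enumerate(jobs):
        G.add_edge("s", ("job", j), capacity=p)          # total processing demand
        for a, b in intervals:
            if r <= a and b <= d:                        # interval lies in the window
                G.add_edge(("job", j), ("int", a, b), capacity=b - a)
    for a, b in intervals:
        G.add_edge(("int", a, b), "t", capacity=M * (b - a))  # M machines per day

    flow_value, _ = nx.maximum_flow(G, "s", "t")
    return flow_value == sum(p for _, _, p in jobs)

# Illustrative data (assumption): three jobs on two machines.
jobs = [(0, 3, 2), (0, 2, 2), (1, 4, 2)]
print(feasible_schedule_exists(jobs, M=2))  # expected: True
```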

Math Exam Help | Combinatorial Optimization: for high scores, choose UpriviateTA.

Success case: Numerical Analysis (a follow-up course to Linear Algebra)

Success case: QR decomposition, Gram-Schmidt orthogonalization, and the least squares method

Success case: Convex Optimization