Minimum Cost to Reach Destination in a Weighted Graph [Solution]
Solution 1: Dijkstra’s Algorithm
Intuition:
To find the minimum cost to reach the destination node from the source node in a weighted graph, we can use Dijkstra’s algorithm. Dijkstra’s algorithm finds the shortest path from a source node to every other node in a graph with non-negative edge weights.
We’ll maintain a distance array to keep track of the minimum cost to reach each node from the source.
We’ll initialize the distance of the source node as 0 and all other nodes as infinity.
Then, we’ll repeatedly select the node with the minimum distance from the set of unvisited nodes, update the distances of its neighbors if a shorter path is found, and mark the node as visited.
We’ll continue this process until all reachable nodes have been visited; once the destination node is visited, its distance is final, so we could also stop early at that point.
If, after the algorithm completes, the distance to the destination node is still infinity, there is no path from the source to the destination, and we return -1. Otherwise, we return the distance to the destination node.
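For example, on the four-node graph used in the tests below (edges 0→1 with cost 10, 0→2 with cost 20, 1→2 with cost 5, 1→3 with cost 10, and 2→3 with cost 20), the distances array starts as [0, inf, inf, inf]. Processing node 0 updates it to [0, 10, 20, inf]; processing node 1, the closest unvisited node, updates it to [0, 10, 15, 20]; processing nodes 2 and 3 changes nothing further, so the minimum cost from node 0 to node 3 is 20, along the path 0 → 1 → 3.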
Solution:
import sys
from typing import List

def min_cost_to_reach_destination(graph: List[List[int]], source: int, destination: int) -> int:
    # Number of nodes in the graph
    n = len(graph)
    # Initialize the distances array; initially every node is at infinity
    distances = [sys.maxsize] * n
    distances[source] = 0  # Distance from the source to itself is 0
    # Initialize the visited array to keep track of visited nodes
    visited = [False] * n
    # Iterate over all nodes
    for _ in range(n):
        # Find the unvisited node with the minimum distance
        min_distance = sys.maxsize
        min_node = -1
        for i in range(n):
            if not visited[i] and distances[i] < min_distance:
                min_distance = distances[i]
                min_node = i
        # If no reachable unvisited node is left, stop
        if min_node == -1:
            break
        # Mark the selected node as visited
        visited[min_node] = True
        # Update the distances of its neighbors if a shorter path is found
        for j in range(n):
            if graph[min_node][j] > 0 and not visited[j]:
                distances[j] = min(distances[j], distances[min_node] + graph[min_node][j])
    # If the distance to the destination node is still infinity, return -1
    if distances[destination] == sys.maxsize:
        return -1
    return distances[destination]
# Test cases
graph = [
[0, 10, 20, 0],
[0, 0, 5, 10],
[0, 0, 0, 20],
[0, 0, 0, 0]
]
result = min_cost_to_reach_destination(graph, 0, 3)
print(result) # Output: 20
assert result == 20
result = min_cost_to_reach_destination(graph, 3, 0)
print(result) # Output: -1
assert result == -1
graph = [
[0, 10, 20, 0],
[0, 0, 5, 10],
[0, 0, 0, 2],
[0, 0, 0, 0]
]
result = min_cost_to_reach_destination(graph, 0, 3)
print(result) # Output: 17
assert result == 17
Time Complexity:
The time complexity of Dijkstra’s algorithm is O(V^2) for this adjacency-matrix implementation, where V is the number of vertices in the graph.
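The quadratic term comes from the linear scan used to select the next closest node. As a rough sketch of the standard alternative (not part of the solution above), pairing a binary heap with an adjacency list reduces this to O((V + E) log V), where E is the number of edges. The adjacency-list input format of (neighbor, weight) pairs and the name min_cost_heap are assumptions made for this sketch, since the problem above supplies an adjacency matrix.

import heapq
from typing import List, Tuple

def min_cost_heap(adj: List[List[Tuple[int, int]]], source: int, destination: int) -> int:
    # adj[u] is a list of (v, w) pairs meaning an edge u -> v with cost w (assumed input format)
    n = len(adj)
    distances = [float('inf')] * n
    distances[source] = 0
    heap = [(0, source)]  # entries are (distance so far, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > distances[u]:
            continue  # stale entry; a cheaper path to u was already settled
        if u == destination:
            return d  # the first time the destination is popped, its distance is final
        for v, w in adj[u]:
            if d + w < distances[v]:
                distances[v] = d + w
                heapq.heappush(heap, (distances[v], v))
    return -1  # the destination was never reached

An adjacency matrix like the one in the tests can be converted to this format with adj = [[(j, w) for j, w in enumerate(row) if w > 0] for row in graph].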
Space Complexity:
The space complexity is O(V) for the distances array, the visited array, and a few scalar variables, where V is the number of vertices in the graph. Additionally, the adjacency-matrix representation of the graph itself takes O(V^2) space. Therefore, the overall space complexity is O(V^2).
Solution 2: Brute Force
Intuition:
The brute force approach involves exhaustively searching through all possible paths from the source node to the destination node and finding the minimum cost among them. Here’s how we can implement the brute force approach:
- We’ll use recursion to explore all possible paths from the source node to the destination node.
- At each step, we’ll try to move to every adjacent node that is not already on the current path (so that cycles cannot send the recursion into an infinite loop), and recursively explore each option.
- We’ll keep track of the cost incurred so far in each path.
- Once we reach the destination node, we’ll compare the cost of the current path with the minimum cost found so far and update it if the current path has a lower cost.
- We’ll continue this process until we explore all possible paths from the source node to the destination node.
Solution:
from typing import List

def min_cost_to_reach_destination(graph: List[List[int]], source: int, destination: int) -> int:
    n = len(graph)
    # Track the nodes on the current path so cycles cannot cause infinite recursion
    visited = [False] * n

    # Helper function for recursive path exploration
    def explore_path(node: int, cost_so_far: int) -> int:
        # Base case: if the current node is the destination, return the cost so far
        if node == destination:
            return cost_so_far
        visited[node] = True
        # Initialize the minimum cost to a large value
        min_cost = float('inf')
        # Explore every neighbor of the current node that is not already on the path
        for neighbor, edge_cost in enumerate(graph[node]):
            if edge_cost > 0 and not visited[neighbor]:  # edge to a node not yet on the path
                # Recursively explore the path through this neighbor and keep the cheaper option
                min_cost = min(min_cost, explore_path(neighbor, cost_so_far + edge_cost))
        visited[node] = False  # Backtrack so other paths may still pass through this node
        return min_cost

    # Start exploring paths from the source node
    min_path_cost = explore_path(source, 0)
    # If the minimum path cost is still infinity, no path exists
    if min_path_cost == float('inf'):
        return -1
    return min_path_cost
# Test cases
graph = [
[0, 10, 20, 0],
[0, 0, 5, 10],
[0, 0, 0, 20],
[0, 0, 0, 0]
]
result = min_cost_to_reach_destination(graph, 0, 3)
print(result) # Output: 20
assert result == 20
result = min_cost_to_reach_destination(graph, 3, 0)
print(result) # Output: -1
assert result == -1
graph = [
[0, 10, 20, 0],
[0, 0, 5, 10],
[0, 0, 0, 2],
[0, 0, 0, 0]
]
result = min_cost_to_reach_destination(graph, 0, 3)
print(result) # Output: 17
assert result == 17
Time Complexity:
Because this approach exhaustively explores all possible paths, it is highly inefficient, especially for large graphs, as it has an exponential time complexity.
The time complexity of this approach is exponential in V, the number of vertices in the graph; in the worst case it is on the order of O(V!), the number of distinct simple paths in a dense graph.
Here’s why:
- Recursive Calls: At each step of the recursion, we explore every neighbor of the current node that is not already on the current path. This gives a branching factor of up to V, because in the worst case every node is connected to every other node.
- Recursion Depth: The depth of the recursion tree corresponds to the length of the current path from the source node. In the worst case a path visits every vertex, so the recursion depth can be as large as V.
Combining these factors, each level of the recursion can branch into up to V calls and there can be up to V levels, so the total number of recursive calls (nodes in the recursion tree) grows exponentially with the number of vertices.
As a result, the brute force approach quickly becomes impractical for real-world graphs of any significant size.
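To make this concrete, in a complete graph on V vertices the number of distinct simple paths between two fixed vertices is the number of ordered selections of intermediate vertices, summed over all path lengths, which comes to roughly e·(V-2)!. For V = 10 that is already 109,601 distinct simple paths, far more than 2^10 = 1,024, which is why the search becomes intractable so quickly.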
Space Complexity:
In the brute force approach, the space complexity mainly depends on the recursion depth.
- Recursion Depth: The recursion depth is bounded by the length of the longest path from the source node to the destination node. In the worst case a path visits every vertex, so the call stack uses O(V) space, where V is the number of vertices.
- Auxiliary Space: Apart from the recursion, the visited array takes O(V) space, and the remaining variables (the minimum cost and the running cost in each call) take constant space.
So, the overall space complexity of the brute force approach is O(V).