Algorithm Efficiency! Unveiling the Impact of Every Line of Code with Time and Space Complexity Analysis

Metrics provide a quantitative way to understand how the resource requirements of an algorithm grow with input size.

Narendra Harny
Make Android


Time and space complexity are essential concepts in computer science that help assess the efficiency of algorithms. These metrics give a quantitative way to understand how an algorithm’s resource requirements grow with the input size.


Time Complexity:

Time complexity measures how long an algorithm takes to finish as a function of the input size.

Big O notation describes the upper bound of an algorithm’s growth rate. For example, O(n) denotes linear time complexity, where the execution time increases proportionally with the input size.

Analyzing time complexity involves counting function executions, loop iterations, recursive calls, and any other significant computations that contribute to the overall time cost. Understanding time complexity helps in selecting efficient algorithms for various problem sizes.
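To make these growth rates concrete, here is a small hypothetical sketch (the class and method names are made up for illustration, not part of the problem below) comparing O(1), O(n), and O(n²) work in Java:

// Hypothetical illustration of different growth rates.
class GrowthRateExamples {

    // O(1): a single array access, independent of input size.
    static int firstElement(int[] data) {
        return data[0];
    }

    // O(n): one pass over the array; work grows linearly with its length.
    static long sum(int[] data) {
        long total = 0;
        for (int value : data) {
            total += value;
        }
        return total;
    }

    // O(n^2): nested loops over the array; work grows quadratically.
    static int countPairs(int[] data) {
        int pairs = 0;
        for (int i = 0; i < data.length; i++) {
            for (int j = i + 1; j < data.length; j++) {
                pairs++;
            }
        }
        return pairs;
    }
}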

Space Complexity:

Space complexity measures the additional memory an algorithm requires relative to the input size.

Like time complexity, it uses Big O notation to represent the upper limit of space usage during code execution. For example, O(1) represents constant space complexity, signifying that the memory requirements stay the same regardless of input size.

Space analysis considers variables, data structures, and the call stack. Temporary storage for calculations and the size of dynamic data structures contribute to the space requirements. Optimizing space complexity is crucial for efficient memory utilization, especially in resource-constrained environments.
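As a hypothetical sketch of that difference (names invented for illustration), here are two ways to double every value in an array, one with constant extra space and one that allocates a copy:

import java.util.Arrays;

// Hypothetical illustration of constant vs. linear extra space.
class SpaceExamples {

    // O(1) extra space: modifies the input in place; only a loop counter is added.
    static void doubleInPlace(int[] data) {
        for (int i = 0; i < data.length; i++) {
            data[i] *= 2;
        }
    }

    // O(n) extra space: allocates a second array as large as the input.
    static int[] doubleIntoCopy(int[] data) {
        int[] result = Arrays.copyOf(data, data.length);
        for (int i = 0; i < result.length; i++) {
            result[i] *= 2;
        }
        return result;
    }
}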

Balancing Act:

Algorithms with lower time and space complexities are preferable, but the right choice depends on the application context and its constraints.

Optimizing time complexity might involve trade-offs with space complexity and vice versa. The goal is often to strike a balance that meets the specific needs of the problem.
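As a hypothetical sketch of such a trade-off (the method names are invented), the question “does this array contain a duplicate?” can be answered with O(1) extra space and O(n²) time, or with O(n) extra space and O(n) time:

import java.util.HashSet;
import java.util.Set;

// Hypothetical illustration of trading space for time.
class DuplicateCheckExamples {

    // O(1) extra space, O(n^2) time: compare every pair.
    static boolean hasDuplicateQuadratic(int[] data) {
        for (int i = 0; i < data.length; i++) {
            for (int j = i + 1; j < data.length; j++) {
                if (data[i] == data[j]) return true;
            }
        }
        return false;
    }

    // O(n) extra space, O(n) time: remember every value seen so far.
    static boolean hasDuplicateLinear(int[] data) {
        Set<Integer> seen = new HashSet<>();
        for (int value : data) {
            if (!seen.add(value)) return true; // add() returns false on a repeat
        }
        return false;
    }
}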

Understanding time and space complexity helps programmers make informed decisions about algorithm choice and design. Efficient algorithms contribute to faster execution and better resource utilization, playing a crucial role in software development.

The Problem Statement

A very basic problem from LeetCode: given an integer array, find the two numbers whose sum equals a given target.

Example!!

Input: numberArray = [3,5,12,17], target = 8
Output: [0,1]
Explanation: numberArray[0] + numberArray[1] == 8, so we return [0, 1].

Simple, isn’t it?

I am going to write multiple solutions for the same problem and see the result.

Have a look at all the solutions for the same problem

Code Sample: (Runtime: 0ms) (Memory: 44.9 MB)

class Solution {
    public int[] twoSum(int[] nums, int target) {
        // Compare every pair of elements that sit i positions apart.
        for (int i = 1; i < nums.length; i++) {
            for (int j = i; j < nums.length; j++) {
                if (nums[j] + nums[j - i] == target) {
                    return new int[]{j, j - i};
                }
            }
        }
        return null;
    }
}

Code Sample: (Runtime: 1ms) (Memory: 45 MB)

import java.util.HashMap;

class Solution {
    public int[] twoSum(int[] nums, int target) {
        HashMap<Integer, Integer> hp = new HashMap<>();
        for (int i = 0; i < nums.length; i++) {
            // If the current value completes a previously seen element, we are done.
            if (hp.containsKey(nums[i])) return new int[]{i, hp.get(nums[i])};
            // Otherwise remember which value would complete the current element.
            hp.put(target - nums[i], i);
        }
        return null; // No pair sums to the target.
    }
}

Code Sample: (Runtime: 88ms) (Memory: 42.1 MB)

class Solution {
    public int[] twoSum(int[] nums, int target) {
        for (int i = 0; i < nums.length; i++) {
            for (int j = i + 1; j < nums.length; j++) {
                if (nums[i] + nums[j] == target) {
                    int[] answer = new int[]{i, j};
                    System.gc();
                    return answer;
                }
            }
        }
        return null;
    }
}

Analysis!!

Let’s delve into the specific differences:

Solution 1:

Time Complexity: O(n²) due to nested loops.

Space Complexity: O(1) since it doesn’t use additional data structures.

Solution 2:

Time Complexity: O(n) — It only requires a single pass through the array.

Space Complexity: O(n) — It uses a HashMap to store complement values.

Solution 3:

Time Complexity: O(n²) due to nested loops.

Space Complexity: O(1) — It uses a constant amount of space.

Key Differences

Solution 1 and Solution 3 have a nested loop structure, resulting in quadratic time complexity, while Solution 2 uses a single loop for linear time complexity.

Solution 2 employs a HashMap for constant-time lookups, making it more efficient compared to Solution 1 and Solution 3.

Solution 3 has a “System.gc()” call, but this is generally discouraged. Garbage collection is managed by the Java Virtual Machine, and manually triggering it might not guarantee immediate memory release.

In summary, Solution 2 stands out for its more optimal time and space complexity, making it the preferred choice among the three solutions.

There is one more solution worth noticing.

The most efficient solution for the two-sum problem involves using a HashMap, similar to Solution 2. Here’s a refined version:

import java.util.HashMap;

class Solution {
    public int[] twoSum(int[] nums, int target) {
        HashMap<Integer, Integer> map = new HashMap<>();
        for (int i = 0; i < nums.length; i++) {
            int complement = target - nums[i];
            if (map.containsKey(complement)) {
                return new int[]{map.get(complement), i};
            }
            map.put(nums[i], i);
        }
        return null; // No solution found
    }
}

This solution maintains a HashMap to store the complement of each element encountered so far. It iterates through the array once, performing constant-time lookups in the HashMap. This results in a time complexity of O(n) and a space complexity of O(n), making it a highly efficient solution.
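For instance, a quick hypothetical driver (the TwoSumDemo class below is only for illustration) using the input from the problem statement above would print [0, 1]:

import java.util.Arrays;

public class TwoSumDemo {
    public static void main(String[] args) {
        int[] numberArray = {3, 5, 12, 17};
        int[] result = new Solution().twoSum(numberArray, 8);
        // Prints [0, 1] because numberArray[0] + numberArray[1] == 8.
        System.out.println(Arrays.toString(result));
    }
}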

What makes this solution better than Solution 2?

Code Optimization and Code Quality are two points that are easy to confuse with each other, so let’s separate them!

In the refined solution, I made a few adjustments for clarity and best practices. These adjustments certainly do not improve performance; they are purely about code quality. Here are the points!

Variable Naming: Renamed the hp (HashMap) to a more descriptive name, map. Clear and meaningful variable names improve code readability.

Complement Calculation: Introduced a variable complement to represent the difference between the target and the current array element. This enhances code readability and simplifies understanding.

Consistent Iteration: Used a standard for loop to iterate through the array, starting from index 0. This makes the code more consistent and aligns with typical array traversal.

Return Statement: Returned null explicitly when no solution is found. This is a more explicit and clear way of conveying that there is no valid answer.

Comments: Added a comment to indicate that the function returns null when no solution is found. Comments can be helpful for code maintenance and understanding.

These changes aim to make the code more readable, maintainable, and in line with best practices without altering the core algorithmic approach from Solution 2.

In conclusion, Code Optimization works to improve the algorithm itself, whereas Code Quality only concerns readability and maintainability.

Conclusion!

As mentioned, Code Performance Optimization is about improving the algorithm, and it rests on two concepts: time and space complexity.

Improving the algorithm mainly means reducing iterative work and reducing the amount of resources consumed.

For example, creating an object once is not a big deal, but when that object is created inside a loop, it becomes one of the most important things to consider for optimization if possible.
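A minimal hypothetical sketch of that point (names invented for illustration): concatenating strings inside a loop creates a new String object on every iteration, while a single StringBuilder created once outside the loop is reused:

// Hypothetical illustration of object creation inside a loop.
class LoopAllocationExample {

    // Builds a new String object on every iteration (String is immutable),
    // which puts avoidable pressure on memory for large inputs.
    static String joinNaive(String[] parts) {
        String result = "";
        for (String part : parts) {
            result = result + part;
        }
        return result;
    }

    // Creates one StringBuilder outside the loop and reuses it.
    static String joinWithBuilder(String[] parts) {
        StringBuilder builder = new StringBuilder();
        for (String part : parts) {
            builder.append(part);
        }
        return builder.toString();
    }
}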

If you want to read more about Time and Space Complexity notation for each data structure, please refer to the PDF.

Thanks for Reading!

Please comment with questions and suggestions! If this blog was helpful to you, please hit the clap! Follow on Medium, connect on LinkedIn, follow on Instagram, and send an email if any discussion is needed on the same topic.
Please follow and subscribe to the Make Android publication by Narendra K H for prompt updates on Android platform-related blogs.

Thank you!

