<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[ValueCurve]]></title><description><![CDATA[From Insights to Impact - Human Focus in the AI world. ]]></description><link>https://on.valuecurve.ai</link><image><url>https://substackcdn.com/image/fetch/$s_!3AgI!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbcb3fea-b543-4848-9539-ab6ba8c51766_500x500.png</url><title>ValueCurve</title><link>https://on.valuecurve.ai</link></image><generator>Substack</generator><lastBuildDate>Sat, 11 Apr 2026 22:24:40 GMT</lastBuildDate><atom:link href="https://on.valuecurve.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Sarfaraz Mulla]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[valuecurve@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[valuecurve@substack.com]]></itunes:email><itunes:name><![CDATA[Sarfaraz Mulla]]></itunes:name></itunes:owner><itunes:author><![CDATA[Sarfaraz Mulla]]></itunes:author><googleplay:owner><![CDATA[valuecurve@substack.com]]></googleplay:owner><googleplay:email><![CDATA[valuecurve@substack.com]]></googleplay:email><googleplay:author><![CDATA[Sarfaraz Mulla]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Open Models Close the Gap, Gemma 4 Goes MoE, and a Model Writes a Chromosome]]></title><description><![CDATA[Anthropic revealed how LLMs process emotion internally, a genomics model generated its first functional chromosome, and open models functionally matched their closed rivals on agentic 
tasks.]]></description><link>https://on.valuecurve.ai/p/open-models-close-the-gap-gemma-4</link><guid isPermaLink="false">https://on.valuecurve.ai/p/open-models-close-the-gap-gemma-4</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 05 Apr 2026 14:43:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!b1th!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!b1th!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!b1th!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 424w, https://substackcdn.com/image/fetch/$s_!b1th!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 848w, https://substackcdn.com/image/fetch/$s_!b1th!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!b1th!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!b1th!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:631521,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://on.valuecurve.ai/i/193257286?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!b1th!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 424w, https://substackcdn.com/image/fetch/$s_!b1th!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 848w, https://substackcdn.com/image/fetch/$s_!b1th!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!b1th!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fce1d48e4-6417-40b1-a57f-c92f0dbff771_3840x2160.jpeg 
1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div><hr></div><h3><strong>1. <a href="https://www.anthropic.com/research/emotion-concepts-function">Anthropic Maps How LLMs Process Emotion</a></strong></h3><p><a href="https://www.anthropic.com/">Anthropic</a> published new interpretability research examining how emotional concepts are represented and utilized inside large language models.
The work studies the internal processing systems that govern how models handle emotional representations &#8212; part of Anthropic&#8217;s broader program to understand what models are actually doing rather than what we assume they do.</p><p>Any team using system prompts to control tone &#8212; friendly for customer support, precise for technical documentation, professional for sales &#8212; is relying on the model&#8217;s internal representation of emotional and tonal concepts. Understanding how those representations actually work informs better prompt engineering and more targeted preference training. It also raises a design question: if models have internal structures that process emotional framing, then tone is not just a surface-level instruction but something the model reasons about at a deeper level. For teams running DPO or RLHF where tone is a preference criterion, this research offers a window into the mechanism being optimized.</p><h3><strong>2. <a href="https://arcinstitute.org/blog/evo2">Evo2 &#8212; A Foundation Model That Writes Chromosomes</a></strong></h3><p><a href="https://arcinstitute.org/">Arc Institute</a> released <a href="https://huggingface.co/arcinstitute/evo2_7b">Evo2</a>, a 300-billion-parameter genomics foundation model with a 131,000-token context window, trained on 9.3 trillion nucleotides from the OpenGenome2 dataset across multiple species. It can generate entire functional chromosomes and predict the effects of genetic mutations.</p><p>Previous DNA models were limited to roughly 8,000 tokens &#8212; far too short to capture the long-range dependencies that govern how genes actually function. Evo2&#8217;s leap to 131K context is what makes chromosome-scale generation possible. The release philosophy is equally significant: in a field where bio-AI models are increasingly locked behind pharmaceutical company walls, Arc Institute published both open weights and the full training dataset. 
The &#8220;scale plus long context&#8221; approach that transformed language models transfers directly to biological sequences. That finding extends well beyond genomics.</p><div><hr></div><h3><strong>3. <a href="https://blog.langchain.com/open-models-have-crossed-a-threshold/">Open Models Cross the Threshold &#8212; and Start Healing Themselves</a></strong></h3><p><a href="https://github.com/langchain-ai/langchain">LangChain</a> published two significant pieces this week. The first is an analysis showing that open models like GLM-5 and <a href="https://huggingface.co/MiniMaxAI">MiniMax M2.7</a> now match closed frontier models on core agent capabilities, while delivering significantly lower costs and faster latency in production. The second describes a <a href="https://blog.langchain.com/production-agents-self-heal/">self-healing deployment pipeline</a> for production agents &#8212; an automated system that detects regressions after each deploy, diagnoses the root cause, and opens a pull request with a fix before a human needs to intervene.</p><p>Together, these two developments mark a shift in maturity. The capability gap between open and closed models has functionally closed for agentic tasks &#8212; the economics now favour self-hosted inference, where per-token API billing is replaced by fixed infrastructure costs. And the production story is catching up too. Building a capable agent is one challenge; keeping it working as dependencies shift and data distributions drift is another. The self-healing pattern closes the loop from detection to remediation automatically, transforming maintenance from a reactive burden into an automated feedback loop. Open models are not just matching closed ones on benchmarks &#8212; the tooling around them is reaching production grade.</p><h3><strong>4. 
<a href="https://deepmind.google/blog/gemma-4-byte-for-byte-the-most-capable-open-models/">Gemma 4 Arrives with MoE, Agentic Design, and a Ready Serving Stack</a></strong></h3><p><a href="https://deepmind.google/">Google DeepMind</a> released <a href="https://huggingface.co/collections/google/gemma-4-686ca40b30bd7d89a0befd0e">Gemma 4</a>, their most capable open model family to date. The architecture combines Mixture of Experts with native multimodal input, structured reasoning, and tool-use capabilities. <a href="https://github.com/vllm-project/vllm/releases/tag/v0.19.0">vLLM v0.19.0</a> shipped the same week with full Gemma 4 support, along with zero-bubble async scheduling and speculative decoding &#8212; 448 commits from 197 contributors.</p><p>MoE decouples capability from inference cost. A model with a large total parameter count but fewer active parameters per token delivers frontier-level quality at a fraction of the compute. The simultaneous vLLM release means the path from download to production serving is already clear. Speculative decoding, which generates candidate tokens with a smaller draft model and verifies them in batch, adds a meaningful latency improvement on top. For teams maintaining a model evaluation matrix, Gemma 4 is an immediate candidate with its serving infrastructure already in place.</p><div><hr></div><h3><strong>5. <a href="https://www.technologyreview.com/2026/03/31/1134833/ai-benchmarks-are-broken-heres-what-we-need-instead/">HAIC &#8212; A New Framework for Evaluating AI in Teams</a></strong></h3><p><a href="https://www.technologyreview.com/">MIT Technology Review</a> published a piece arguing that current AI evaluation methods are fundamentally misaligned with how AI is actually deployed. Benchmarks test isolated tasks against human performance, but production AI operates as part of collaborative human-AI teams within organizational workflows.</p><p>The proposed alternative is HAIC &#8212; Human-AI, Context-Specific Evaluation. 
Instead of asking &#8220;can this model solve this problem alone?&#8221;, HAIC asks &#8220;does this model improve outcomes when embedded in a team over time?&#8221; The distinction changes what gets optimized. Current benchmarks reward standalone capability. HAIC rewards integration quality, collaborative efficiency, and sustained performance under real-world conditions. For teams that have seen a model ace benchmarks and then underperform in production, this framework offers a concrete direction forward. Benchmarks drive development priorities &#8212; when the benchmarks measure the wrong thing, the priorities follow.</p><h3><strong>6. <a href="https://huggingface.co/blog/Hcompany/holo3">Holo3 &#8212; An Open Model for Computer Use</a></strong></h3><p><a href="https://huggingface.co/Hcompany">HCompany</a> released Holo3, an open model for computer use and UI automation tasks. The model pushes the boundaries of how agents interact with desktop and web interfaces &#8212; clicking, typing, navigating, and completing multi-step workflows across applications.</p><p>Computer use is a rapidly evolving agent category. Anthropic&#8217;s Claude, OpenAI&#8217;s Operator, and Google&#8217;s Project Mariner have all staked positions, but these are closed systems. Holo3 is an open-weight alternative published directly on Hugging Face. For teams building agent-based products, an open computer-use model means the capability can be fine-tuned, self-hosted, and integrated into custom workflows without API dependencies. As agentic architectures move from chat-based interactions to tool-use and environment manipulation, computer use becomes a core capability layer rather than a novelty demo.</p><div><hr></div><h3><strong>What Ties These Together</strong></h3><p>Open models are not just closing the capability gap with closed alternatives &#8212; they are building an ecosystem around themselves. Better serving infrastructure ships in lockstep with new architectures. 
Interpretability research reveals what these models are actually doing under the hood. Evaluation frameworks are evolving to match how AI is actually used. Production patterns are maturing from monitoring to automated remediation. The question has shifted from &#8220;can open models compete?&#8221; to &#8220;what does the full production stack look like when open models are the default?&#8221; That is where the industry is now building.</p>]]></content:encoded></item><item><title><![CDATA[Forgejo: Self-Hosted Git Infrastructure]]></title><description><![CDATA[Forgejo is a community-driven fork of Gitea, which itself descended from Gogs. It became a hard fork under Codeberg e.V., achieving full independence from its upstream.]]></description><link>https://on.valuecurve.ai/p/forgejo-self-hosted-git-infrastructure</link><guid isPermaLink="false">https://on.valuecurve.ai/p/forgejo-self-hosted-git-infrastructure</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 04 Jan 2026 03:31:20 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/4f97c97b-c4f6-4064-a832-1eebf48b1193_3999x2666.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>GitHub and GitLab run on centralized infrastructure. Your repositories live on their servers, subject to their pricing, policies, and availability. Forgejo is a self-hosted alternative&#8212;you run the Git server on your own hardware.</p><p>This isn&#8217;t inherently better or worse. 
It&#8217;s a different set of tradeoffs: swap SaaS convenience for control, predictability, and (usually) lower costs as you scale.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!isKy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!isKy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 424w, https://substackcdn.com/image/fetch/$s_!isKy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 848w, https://substackcdn.com/image/fetch/$s_!isKy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 1272w, https://substackcdn.com/image/fetch/$s_!isKy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!isKy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png" width="728" height="276.2985074626866" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:356,&quot;width&quot;:938,&quot;resizeWidth&quot;:728,&quot;bytes&quot;:112826,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/182708370?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!isKy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 424w, https://substackcdn.com/image/fetch/$s_!isKy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 848w, https://substackcdn.com/image/fetch/$s_!isKy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 1272w, https://substackcdn.com/image/fetch/$s_!isKy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fad694385-9a84-45c8-ad12-32c2bc13cf9e_938x356.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div><hr></div><h4>What Forgejo Provides</h4><p><a href="https://forgejo.org">Forgejo</a> started as a soft fork of Gitea in 2022 over governance concerns. In February 2024, it became a hard fork under Codeberg e.V., achieving full independence from its upstream.
The latest stable release is v13.0 (October 2025), bringing WebAuthn support and 15% memory optimizations.</p><p>Core functionality:</p><ul><li><p>Git repository hosting</p></li><li><p>Pull request and code review workflow</p></li><li><p>Issue tracking with labels and milestones</p></li><li><p>Built-in CI/CD (Forgejo Actions)</p></li><li><p>Package registry (Docker, npm, Maven, PyPI)</p></li><li><p>Git LFS support</p></li><li><p>OAuth/LDAP/SAML authentication</p></li><li><p>Federation for multi-instance collaboration</p></li><li><p>AI code suggestions (opt-in plugins)</p></li></ul><p>The interface resembles GitHub. Teams familiar with GitHub&#8217;s workflow navigate Forgejo without training.</p><h4>Cost Structure</h4><ul><li><p>GitHub Teams: $5/user/month ($60/year per user)</p></li><li><p>GitLab Premium: $32/user/month ($384/year per user)</p></li></ul><p>For 10 users:</p><ul><li><p>GitHub: $600/year</p></li><li><p>GitLab: $3,840/year</p></li></ul><p>Forgejo on a VPS:</p><ul><li><p>Hetzner CX21: &#8364;4.85/month (~$52/year)</p></li><li><p>Digital Ocean Droplet (2GB): $12/month ($144/year)</p></li></ul><p>These VPS prices work for 10 users or 100 users. No per-seat scaling.</p><p>Hidden costs:</p><ul><li><p>Setup time: 4-6 hours</p></li><li><p>Ongoing maintenance: 2-4 hours/month</p></li><li><p>Backup storage: $1-5/month</p></li></ul><h4>Technical Requirements</h4><p>Minimum specs:</p><ul><li><p>1GB RAM (tight, but usable)</p></li><li><p>2 CPU cores</p></li><li><p>20GB storage + repository sizes</p></li><li><p>Linux server (Ubuntu, Debian)</p></li></ul><p>Recommended for production:</p><ul><li><p>2GB RAM</p></li><li><p>2-4 CPU cores</p></li><li><p>Storage: 50GB base + 1.3x repository sizes</p></li><li><p>SSD strongly recommended</p></li></ul><p>Database options:</p><ul><li><p><strong>SQLite</strong>: Simple, single file. Works until concurrent operations slow down (10-20 active users).</p></li><li><p><strong>PostgreSQL</strong>: Better concurrency.
Migrate when SQLite bottlenecks.</p></li></ul><h4>Installation</h4><p>Docker Compose setup with health checks:</p><pre><code><code>services:
  forgejo:
    image: codeberg.org/forgejo/forgejo:13-rootless  # pin the major version you run; v13 matches the release discussed above
    container_name: forgejo
    environment:
      - USER_UID=1000
      - USER_GID=1000
      - FORGEJO__database__DB_TYPE=postgres
      - FORGEJO__database__HOST=db:5432
      - FORGEJO__database__NAME=forgejo
      - FORGEJO__database__USER=forgejo
      - FORGEJO__database__PASSWD=your_password_here
      - FORGEJO__server__ROOT_URL=https://git.yourdomain.com
      - FORGEJO__server__SSH_PORT=222
    volumes:
      - ./data:/data
      - /etc/timezone:/etc/timezone:ro
      - /etc/localtime:/etc/localtime:ro
    ports:
      - "3000:3000"
      - "222:22"
    depends_on:
      db:
        condition: service_healthy
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000"]
      interval: 30s
      timeout: 10s
      retries: 3
    restart: unless-stopped

  db:
    image: postgres:17-alpine
    environment:
      - POSTGRES_USER=forgejo
      - POSTGRES_PASSWORD=your_password_here
      - POSTGRES_DB=forgejo
    volumes:
      - ./postgres:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U forgejo"]
      interval: 10s
      timeout: 5s
      retries: 5
    restart: unless-stopped
</code></code></pre><p>Notes:</p><ul><li><p>Rootless image for simpler permissions</p></li><li><p>SSH on port 222 avoids host conflicts</p></li><li><p>Health checks ensure service readiness</p></li><li><p>PostgreSQL 17 for latest performance improvements</p></li></ul><p>Start: <code>docker-compose up -d</code></p><p>First run launches the web installer at <code>http://your-server:3000</code>. Configure admin credentials, server URL, and email settings.</p><h4>Reverse Proxy and SSL</h4><p>Forgejo serves HTTP on port 3000. Production needs HTTPS.</p><p><strong>Caddy</strong> (automatic SSL):</p><pre><code><code>git.yourdomain.com {
    reverse_proxy forgejo:3000
}
</code></code></pre><p><strong>Nginx</strong>:</p><pre><code><code>server {
    listen 443 ssl http2;
    server_name git.yourdomain.com;
    
    ssl_certificate /etc/letsencrypt/live/git.yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/git.yourdomain.com/privkey.pem;
    
    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
</code></code></pre><p>Get certificates: <code>certbot --nginx -d git.yourdomain.com</code></p><h4>Basic Usage</h4><p><strong>Repositories:</strong></p><p>Organizations group repositories and manage team permissions. Create organizations for team projects, user accounts for personal repos.</p><p>Clone with custom SSH port:</p><pre><code><code>git clone ssh://git@git.yourdomain.com:222/org/repo.git
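
# Optional: set the SSH port once in ~/.ssh/config (standard OpenSSH config;
# substitute your own domain), then clone without the ssh:// prefix:
#
#   Host git.yourdomain.com
#       Port 222
#
# git clone git@git.yourdomain.com:org/repo.git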
</code></code></pre><p><strong>Pull Requests:</strong></p><p>Standard GitHub workflow:</p><ol><li><p>Branch or fork</p></li><li><p>Push changes</p></li><li><p>Open pull request</p></li><li><p>Inline code review</p></li><li><p>Merge when approved</p></li></ol><p><strong>Issues:</strong></p><p>Support Markdown, labels, milestones, assignees, attachments. Reference commits: <code>fixes #123</code> creates automatic links.</p><p><strong>Webhooks:</strong></p><p>POST JSON to external endpoints on push, PR, issue, or release events. Configure in repository settings for CI/CD triggers.</p><p><strong>Federation:</strong></p><p>Connect multiple Forgejo instances for cross-instance collaboration. Users on one instance can interact with repositories on another. Enable in admin settings.</p><h4>CI/CD: Forgejo Actions</h4><p>GitHub Actions-compatible. Run your own runners.</p><p><strong>Runner Setup:</strong></p><pre><code><code>services:
  runner:
    image: code.forgejo.org/forgejo/runner:3.3.0
    environment:
      - FORGEJO_INSTANCE_URL=https://git.yourdomain.com
      - FORGEJO_RUNNER_REGISTRATION_TOKEN=your_token
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    restart: unless-stopped
</code></code></pre><p>Token from: Site Administration &#8594; Actions &#8594; Runners</p><p><strong>Workflow Example:</strong></p><p><code>.forgejo/workflows/test.yml</code>:</p><pre><code><code>name: Test
on: [push, pull_request]

jobs:
  test:
    runs-on: docker
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v4
        with:
          python-version: '3.11'
      - run: pip install -r requirements.txt
      - run: pytest
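      # Optional: cache pip downloads between runs. actions/cache is the upstream
      # GitHub action; it generally works on Forgejo runners, but verify on yours
      # before relying on it.
      # - uses: actions/cache@v4
      #   with:
      #     path: ~/.cache/pip
      #     key: pip-${{ hashFiles('requirements.txt') }}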
</code></code></pre><p>Most GitHub Actions work without modification.</p><h4>AI Code Suggestions</h4><p>V13 introduces opt-in AI plugins for code completion and suggestions. Configure via admin panel. Integrates with OpenAI API or self-hosted models. Privacy-focused: code never leaves your infrastructure if using local models.</p><p>Enable in: Site Administration &#8594; AI Settings</p><h4>Backup Strategy</h4><p>Critical data:</p><ol><li><p>Repositories (<code>./data</code> volume)</p></li><li><p>PostgreSQL database</p></li><li><p>Configuration (<code>app.ini</code>)</p></li></ol><p><strong>Daily Backup:</strong></p><pre><code><code>#!/bin/bash
set -euo pipefail  # abort on errors, unset variables, and failed pipeline stages

BACKUP_DIR="/backups/forgejo"
mkdir -p "$BACKUP_DIR"
DATE=$(date +%Y%m%d_%H%M%S)

# Database backup (no downtime)
docker-compose exec -T db pg_dump -U forgejo forgejo | gzip &gt; "$BACKUP_DIR/db_$DATE.sql.gz"

# Repository backup
tar -czf "$BACKUP_DIR/data_$DATE.tar.gz" ./data

# Retention: 7 days
find "$BACKUP_DIR" -type f -mtime +7 -delete
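
# Spot-check that archives actually restore (commands below are illustrative;
# point them at a scratch database, not production):
#   gunzip -c "$BACKUP_DIR/db_$DATE.sql.gz" | docker-compose exec -T db psql -U forgejo forgejo
#   tar -tzf "$BACKUP_DIR/data_$DATE.tar.gz" > /dev/null && echo "archive readable"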
</code></code></pre><p><strong>Off-site:</strong></p><pre><code><code>rclone sync /backups/forgejo s3:bucket-name/forgejo
</code></code></pre><p>Test restores regularly.</p><h4>Updates</h4><p>New releases ship every 4-8 weeks.</p><pre><code><code>docker-compose pull
docker-compose up -d
docker-compose logs -f forgejo
</code></code></pre><p>Downtime: 30-60 seconds.</p><p><strong>Security:</strong></p><ul><li><p>Enable 2FA for admins</p></li><li><p>SSH key-based auth only</p></li><li><p>Strong database passwords</p></li><li><p>Keep host OS updated</p></li></ul><h4>When to Self-Host</h4><p><strong>Good fit:</strong></p><ul><li><p>You manage servers already</p></li><li><p>Stable team size</p></li><li><p>Data sovereignty requirements</p></li><li><p>Infrastructure control preference</p></li></ul><p><strong>Poor fit:</strong></p><ul><li><p>No operational expertise</p></li><li><p>Rapid unpredictable growth</p></li><li><p>Critical uptime without HA capability</p></li><li><p>Small teams focused on product</p></li></ul><h4>Getting Started</h4><p>Evaluation:</p><ol><li><p>Deploy local instance</p></li><li><p>Create test repository</p></li><li><p>Try workflows (clone, commit, PR)</p></li><li><p>Test Actions</p></li><li><p>Practice backup/restore</p></li></ol><p>Resources:</p><ul><li><p>Docs: forgejo.org/docs</p></li><li><p>Forum: codeberg.org/forgejo/discussions</p></li><li><p>Matrix: #forgejo:matrix.org</p></li><li><p>Video: <strong>Forgejo Tutorial</strong> by <a href="https://youtu.be/FPVpKCvFQr8?si=jKiajd5mCnHj8YTb">Awesome Open Source</a></p></li></ul><h4>Bottom Line</h4><p>Forgejo trades SaaS convenience for infrastructure control. You eliminate per-seat costs and external dependencies at the cost of operational responsibility.</p><p>The software is stable.
Choose based on whether managing it aligns with your team&#8217;s capabilities, not ideology.</p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[The Hidden Cost of Accidental Data Exposure]]></title><description><![CDATA[Personally Identifiable Information is any data that can identify a specific individual, either directly or when combined with other information.]]></description><link>https://on.valuecurve.ai/p/why-your-data-might-be-leaking-pii</link><guid isPermaLink="false">https://on.valuecurve.ai/p/why-your-data-might-be-leaking-pii</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 28 Dec 2025 03:31:21 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/e8c52428-1c32-4dc9-9846-2158e1913d9f_6000x3335.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Every day, sensitive information leaks through channels we don&#8217;t think twice about: a stack trace pasted into a Slack message, a customer email forwarded to a vendor, a debug log shared in a GitHub issue. These aren&#8217;t malicious breaches&#8212;they&#8217;re ordinary workflows that happen to contain data that shouldn&#8217;t be shared.</p><p>The problem isn&#8217;t carelessness. 
It&#8217;s that <strong>PII (Personally Identifiable Information) </strong>is often invisible until you know to look for it.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;82d745bd-5e2c-43d8-a5c6-5a641ebe818e&quot;,&quot;duration&quot;:null}"></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><h4>What Counts as PII?</h4><p>PII is any data that can identify a specific individual, either directly or when combined with other information. The definition varies by regulation, but generally includes:</p><p><strong>Direct identifiers</strong> &#8212; Data that points to a specific person on its own:</p><ul><li><p>Full names</p></li><li><p>Email addresses</p></li><li><p>Phone numbers</p></li><li><p>Social Security Numbers</p></li><li><p>Passport and driver&#8217;s license numbers</p></li><li><p>Biometric data</p></li></ul><p><strong>Indirect identifiers</strong> &#8212; Data that can identify someone when combined:</p><ul><li><p>IP addresses</p></li><li><p>Device IDs</p></li><li><p>Location data</p></li><li><p>Dates of birth</p></li><li><p>Employment information</p></li></ul><p><strong>Financial data</strong> &#8212; Often regulated separately but equally sensitive:</p><ul><li><p>Credit card numbers</p></li><li><p>Bank account and routing numbers</p></li><li><p>IBAN codes</p></li></ul><p><strong>Authentication secrets</strong> &#8212; Not traditionally &#8220;PII&#8221; but equally dangerous:</p><ul><li><p>API keys and tokens</p></li><li><p>Passwords</p></li><li><p>Private keys</p></li><li><p>Session tokens</p></li></ul><p>The last category is often overlooked. 
An exposed AWS key isn&#8217;t personal information, but it can grant access to systems containing millions of personal records. The blast radius of a leaked credential often exceeds that of a leaked SSN.</p><blockquote><p><strong>Credentials are the keys to the PII vault.</strong> A single exposed API key can unlock databases containing millions of personal records. That&#8217;s why effective scanning must detect both PII and secrets.</p></blockquote><div><hr></div><h4>Where PII Hides</h4><p>The obvious places&#8212;databases, CRM systems, HR files&#8212;usually have controls. The risk is in the unstructured data that flows through daily work:</p><p><strong>Support tickets</strong>: A customer reports a bug and includes their full account details. The ticket gets escalated, exported to a spreadsheet, shared with engineering. Each hop increases exposure.</p><p><strong>Log files</strong>: Application logs capture request parameters, user IDs, IP addresses, sometimes full payloads. Developers copy these into debugging sessions, paste them into chat, attach them to tickets.</p><p><strong>Code repositories</strong>: Test files contain sample data. Configuration files contain connection strings. Comments contain &#8220;temporary&#8221; credentials. README files contain example API calls with real tokens.</p><p><strong>AI prompts</strong>: Users paste customer conversations, error messages, database queries into ChatGPT or Claude for help. These prompts may be used for model training unless explicitly opted out.</p><p><strong>Email threads</strong>: A message gets forwarded, then forwarded again. By the fifth hop, nobody remembers that the original contained a customer&#8217;s SSN in the signature block.</p><p><strong>Screenshots</strong>: A developer shares a screenshot of a bug. The browser&#8217;s address bar shows a URL with a session token. The page content shows a user&#8217;s profile.</p><h4>The Regulatory Landscape</h4><p>Data protection regulations have teeth. 
Under GDPR, fines can reach &#8364;20 million or 4% of global annual revenue&#8212;whichever is higher. CCPA allows statutory damages of $100&#8211;$750 per consumer, per incident. A single leak of 1,000 customer emails could theoretically result in a $750,000 liability.</p><p>But the real cost is often operational:</p><ul><li><p><strong>Breach notification requirements</strong>: GDPR requires notification within 72 hours. This means incident response, legal review, customer communication&#8212;all on a tight timeline.</p></li><li><p><strong>Right to erasure</strong>: If you can&#8217;t track where data has been copied, you can&#8217;t guarantee deletion.</p></li><li><p><strong>Audit requirements</strong>: Demonstrating compliance requires knowing what data you have and where it lives.</p></li></ul><p>Most breaches don&#8217;t make headlines. They&#8217;re discovered during audits, reported by customers, or found by security researchers. The exposure may have existed for months before detection.</p><h4>Detection is Harder Than It Looks</h4><p>Why doesn&#8217;t everyone just scan for PII before sharing? Because detection is genuinely difficult:</p><p><strong>Format variation</strong>: Phone numbers appear as (555) 123-4567, 555-123-4567, +1 555 123 4567, 5551234567. Email addresses get obfuscated as john[at]example[dot]com. Credit cards have spaces, dashes, or neither.</p><p><strong>False positives</strong>: A 9-digit number might be an SSN or a random ID. A 16-digit number might be a credit card or a tracking number. Without validation, scanners either miss things or flag everything.</p><p><strong>Context matters</strong>: &#8220;John Smith&#8221; in a novel isn&#8217;t PII. &#8220;John Smith, Account #12345&#8221; in a support ticket is. Simple pattern matching can&#8217;t distinguish.</p><p><strong>Secrets are diverse</strong>: AWS keys start with AKIA. GitHub tokens start with ghp_. Stripe keys start with sk_live_. Generic API keys follow no pattern at all. 
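</p><p>A minimal sketch of two of these techniques combined: prefix regexes for well-known key formats, plus Luhn checksum validation to separate plausible card numbers from random 16-digit IDs. The patterns below are simplified illustrations, not any production scanner&#8217;s actual rules:</p>

```python
import re

# Illustrative prefix patterns for well-known credential formats.
# Real scanners use far more precise (and numerous) rules.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "stripe_live_key": re.compile(r"\bsk_live_[A-Za-z0-9]{24,}\b"),
}

def luhn_valid(number: str) -> bool:
    """Check-digit validation used by credit card numbers."""
    digits = [int(d) for d in re.sub(r"[\s-]", "", number)]
    if len(digits) < 13:
        return False
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan(text: str) -> list[tuple[str, str]]:
    """Return (finding_type, matched_text) pairs for secrets and cards."""
    findings = [(name, m.group()) for name, pat in SECRET_PATTERNS.items()
                for m in pat.finditer(text)]
    # A digit run is only flagged as a card if the Luhn check passes,
    # which filters out tracking numbers and random IDs.
    for m in re.finditer(r"\b(?:\d[ -]?){13,19}\b", text):
        if luhn_valid(m.group()):
            findings.append(("credit_card", m.group().strip()))
    return findings
```

<p>Run over a log line, this flags an AWS key by its prefix but flags a 16-digit run only when the checksum passes; the tracking-number false positive described above fails the Luhn check and is skipped.</p><p>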
Each requires specific detection logic.</p><p><strong>Encoding layers</strong>: Data gets base64 encoded, embedded in JSON, nested in XML. A scanner that only checks surface text misses encoded content.</p><div><hr></div><h4>Building a Detection Approach</h4><p>Effective PII detection combines multiple techniques:</p><p><strong>Pattern matching</strong> handles well-formatted data. SSNs follow XXX-XX-XXXX. Credit cards match specific prefixes (4 for Visa, 5 for Mastercard). Email addresses have predictable structure.</p><p><strong>Checksum validation</strong> reduces false positives. Credit card numbers include a check digit validated by the Luhn algorithm. IBANs have country-specific formats with built-in verification. A random 16-digit number fails these checks.</p><p><strong>Prefix detection</strong> catches credentials. Cloud provider keys use identifiable prefixes: <code>AKIA</code> for AWS access keys, <code>ghp_</code> for GitHub tokens, <code>sk_live_</code> for Stripe, <code>AIza</code> for Google APIs. These prefixes exist specifically to enable detection.</p><p><strong>Confidence scoring</strong> acknowledges uncertainty. A pattern match against an SSN format with proper separators is high confidence. A 9-digit number without context is low confidence. Surfacing the score lets users prioritize review.</p><div><hr></div><h4>A Tool to Help</h4><p>We built <a href="https://build.valuecurve.co/tools/privacy-scanner/">Privacy Scanner</a> to make PII detection accessible. 
It&#8217;s a free, browser-based tool that identifies sensitive data in text and files.</p><p>The scanner detects:</p><ul><li><p>Email addresses (including obfuscated formats)</p></li><li><p>Phone numbers (US and international)</p></li><li><p>Social Security Numbers</p></li><li><p>Credit cards (with Luhn validation)</p></li><li><p>Physical addresses (US, UK, EU formats)</p></li><li><p>Bank account numbers and IBANs</p></li><li><p>Cloud credentials (AWS, GitHub, Stripe, Google, Azure, Slack)</p></li><li><p>JWT tokens and private key headers</p></li><li><p>Passwords in plaintext</p></li></ul><p>Each detection includes a confidence score and contributes to an overall risk assessment. The tool generates a redacted preview you can copy directly.</p><p>For sensitive use cases, there&#8217;s a &#8220;Browsers only mode&#8221; where your text never leaves the browser&#8212;the backend only returns coordinates, and masking happens locally.</p><p>No signup required. No data stored.</p><p><strong><a href="https://build.valuecurve.co/tools/privacy-scanner/">Try it here &#8594;</a></strong></p><h4>Building Habits</h4><p>Tools help, but habits matter more. Some practices that reduce accidental exposure:</p><p><strong>Assume logs contain PII</strong>. Before sharing any log output, scan it. Better yet, configure your logging framework to redact sensitive fields at the source.</p><p><strong>Sanitize before escalation</strong>. When forwarding a customer issue, take 30 seconds to remove identifying details that aren&#8217;t necessary for resolution.</p><p><strong>Use separate test data</strong>. Maintain a library of fake but realistic test data. Never copy production data into development environments without anonymization.</p><p><strong>Review before commit</strong>. Add PII scanning to your pre-commit hooks. Catch credentials and test data before they enter version control.</p><p><strong>Question AI prompts</strong>. 
Before pasting into an LLM, ask: does this contain customer data? Could this identify someone? Is there a way to get the same help with anonymized input?</p><p>Privacy incidents rarely involve sophisticated attacks. They happen when ordinary people do ordinary things without realizing what&#8217;s embedded in the data they&#8217;re handling. The fix isn&#8217;t perfect security&#8212;it&#8217;s awareness and accessible tools.</p><div><hr></div><p><em>Questions or feedback? Post your comments, we're improving the scanner based on feedback.</em></p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[vLLM enables 100-request concurrency on a single RTX A5000 for 7B LLMs (benchmark)
]]></title><description><![CDATA[Benchmarking against HuggingFace Transformers, a meaningful baseline that practitioners universally understand]]></description><link>https://on.valuecurve.ai/p/vllm-enables-100-request-concurrency</link><guid isPermaLink="false">https://on.valuecurve.ai/p/vllm-enables-100-request-concurrency</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 14 Dec 2025 03:31:24 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/72e6a66c-9326-4a32-8caa-a7de12f11dd0_1344x768.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This benchmarking test evaluated the performance of vLLM against HuggingFace Transformers for serving the Discovery-Qwen2.5-7B model&#8212;a fine-tuned multilingual model designed for local business discovery. The tests were conducted on an NVIDIA RTX A5000 GPU with 24GB VRAM, measuring throughput, latency, memory efficiency, and scalability under various conditions. The results demonstrate that vLLM delivers transformative performance improvements that make the difference between a research prototype and a production-ready deployment.</em></p><div><hr></div><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;baa1357b-7ed9-45ae-acda-b5345f380c09&quot;,&quot;duration&quot;:null}"></div><div><hr></div><h3>Why Compare vLLM with HuggingFace Transformers?</h3><p>HuggingFace Transformers is the de facto standard library for working with large language models. It provides an accessible, well-documented interface that most ML practitioners use for training, fine-tuning, and inference. The library has facilitated access to state-of-the-art models and serves as the foundation for countless AI applications. 
</p><p>However, HuggingFace was designed primarily as a research and development tool, not as a production inference engine optimized for serving real users at scale. When deploying LLMs in production&#8212;especially for applications serving multiple concurrent users&#8212;HuggingFace&#8217;s straightforward approach becomes a significant bottleneck. </p><ul><li><p>Each request is processed sequentially </p></li><li><p>GPU memory is allocated statically regardless of actual usage</p></li><li><p>There&#8217;s no optimization for common patterns like shared system prompts. </p></li></ul><p>These limitations compound quickly: a system that feels responsive during development becomes frustratingly slow under real-world load. This is precisely where vLLM enters the picture. vLLM is purpose-built for high-throughput LLM serving in production environments. 
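</p><p>To ground this, here is a sketch of what switching to vLLM looks like in practice: the whole prompt workload goes through one batched <code>generate()</code> call instead of a Python loop. The model name and prompt are placeholders for the fine-tuned checkpoint used in this benchmark, and a CUDA GPU with vLLM installed is assumed:</p>

```python
import time

def tokens_per_second(total_tokens: int, elapsed: float) -> float:
    """Throughput metric used throughout the benchmark."""
    return total_tokens / elapsed if elapsed > 0 else 0.0

def run_vllm_batch(model: str, prompts: list[str]) -> float:
    """Generate all prompts in one batched call; returns tok/s.

    Requires a CUDA GPU and `pip install vllm`."""
    from vllm import LLM, SamplingParams

    llm = LLM(model=model, gpu_memory_utilization=0.9)
    params = SamplingParams(temperature=0.7, max_tokens=256)

    start = time.perf_counter()
    outputs = llm.generate(prompts, params)  # one call, continuously batched
    elapsed = time.perf_counter() - start

    generated = sum(len(o.outputs[0].token_ids) for o in outputs)
    return tokens_per_second(generated, elapsed)

# On a GPU machine (placeholder model name):
# run_vllm_batch("Qwen/Qwen2.5-7B-Instruct",
#                ["Suggest a good bakery near the station."] * 20)
```

<p>The HuggingFace baseline, by contrast, is a <code>for</code> loop calling <code>model.generate()</code> once per prompt&#8212;which is exactly the sequential pattern the throughput test below measures.</p><p>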
It introduces several architectural innovations that fundamentally change how inference is performed:</p><ul><li><p><strong>PagedAttention</strong> for efficient memory management,</p></li><li><p><strong>Continuous batching</strong> for handling concurrent requests dynamically, and</p></li><li><p><strong>Prefix caching</strong> for scenarios with repeated prompt patterns.</p></li></ul><p>The benchmark validates whether these claimed improvements translate to real-world performance gains using a fine-tuned Qwen 2.5 7B model. The following five tests were conducted:</p><h3>[1] Throughput Comparison (Sequential vs Batched vs vLLM)</h3><p>This is the fundamental comparison that answers the core question: how much faster is vLLM? We measured how many tokens per second each approach can generate when processing 20 identical prompts. Throughput directly translates to cost and user experience. Higher throughput means serving more users with the same hardware, reducing infrastructure costs, and delivering faster responses. In a production environment, throughput determines how many concurrent users your system can handle before response times become unacceptable. 
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!OkPN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!OkPN!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 424w, https://substackcdn.com/image/fetch/$s_!OkPN!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 848w, https://substackcdn.com/image/fetch/$s_!OkPN!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 1272w, https://substackcdn.com/image/fetch/$s_!OkPN!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!OkPN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png" width="1900" height="397" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/aea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:397,&quot;width&quot;:1900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:64268,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/181052284?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F70c4e3dc-391e-41fb-93bf-1479d2fb9ab2_1900x438.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!OkPN!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 424w, https://substackcdn.com/image/fetch/$s_!OkPN!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 848w, https://substackcdn.com/image/fetch/$s_!OkPN!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 1272w, https://substackcdn.com/image/fetch/$s_!OkPN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Faea583b2-2703-4406-a9e6-1d9fed21d500_1900x397.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>The results are striking and validate vLLM&#8217;s core value proposition. 
HuggingFace sequential processing&#8212;the default approach most developers use when following tutorials and documentation&#8212;achieved only 40.9 tokens per second. vLLM delivered 868.5 tokens per second, a 21.2x improvement. Even compared to HuggingFace&#8217;s batched inference (which requires additional code complexity and careful tuning), vLLM was 1.46x faster while being simpler to deploy and configure.</p><h3>[2] Maximum Concurrent Sequences (KV Cache Efficiency)</h3><p>This test pushed the system to handle increasing numbers of simultaneous requests (1, 10, 25, 50, 100, 150, 200) to identify the breaking point and understand scaling behavior. Real-world applications don&#8217;t serve one user at a time. A local business assistant might see hundreds of users querying simultaneously during peak hours. The ability to handle concurrent requests without crashing, running out of memory, or degrading severely determines whether the system is genuinely production-ready or merely a demo. 
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!az7F!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!az7F!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 424w, https://substackcdn.com/image/fetch/$s_!az7F!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 848w, https://substackcdn.com/image/fetch/$s_!az7F!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 1272w, https://substackcdn.com/image/fetch/$s_!az7F!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!az7F!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png" width="1900" height="464" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:464,&quot;width&quot;:1900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:63126,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/181052284?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd72b8650-cabc-4458-a8dd-73b7047b11ec_1900x527.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!az7F!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 424w, https://substackcdn.com/image/fetch/$s_!az7F!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 848w, https://substackcdn.com/image/fetch/$s_!az7F!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 1272w, https://substackcdn.com/image/fetch/$s_!az7F!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F30f3e8bf-74c1-4a62-9d94-4fba574b2bf4_1900x464.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>vLLM successfully handled all <strong>200 concurrent sequences</strong> without failure. 
More importantly, throughput increased with concurrency&#8212;a counterintuitive result that demonstrates vLLM&#8217;s efficient batching strategy. The system achieved peak throughput of 3,184 tokens per second at maximum concurrency, nearly 70x faster than single-request HuggingFace performance. This remarkable scaling is possible because of <strong>PagedAttention</strong>, vLLM&#8217;s innovative memory management system. Traditional transformers allocate fixed memory blocks for each sequence&#8217;s key-value cache, leading to fragmentation and waste when sequences vary in length. PagedAttention allocates memory in small pages on-demand, similar to how operating systems manage virtual memory. This eliminates waste and allows significantly more sequences to run concurrently within the same GPU memory budget.</p><h3>[3] Prefix Caching Impact</h3><p>This test compared performance with and without prefix caching enabled, using <strong>100 queries</strong> that shared a common system prompt&#8212;a scenario typical for chatbot and assistant applications. Production LLM applications typically use system prompts&#8212;detailed instructions that define the assistant&#8217;s behavior, personality, and constraints. For the fine-tuned model, every single query includes context about being a &#8220;helpful AI assistant for local business discovery&#8221; that understands Hindi, English, Marathi, and Hinglish. Processing this identical prefix repeatedly for every user query wastes substantial computation. 
</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ctNF!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ctNF!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 424w, https://substackcdn.com/image/fetch/$s_!ctNF!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 848w, https://substackcdn.com/image/fetch/$s_!ctNF!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 1272w, https://substackcdn.com/image/fetch/$s_!ctNF!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ctNF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png" width="1900" height="322" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:322,&quot;width&quot;:1900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:51996,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/181052284?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F73d6fb60-8478-48e4-9732-67fcd1b91f73_1900x353.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ctNF!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 424w, https://substackcdn.com/image/fetch/$s_!ctNF!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 848w, https://substackcdn.com/image/fetch/$s_!ctNF!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 1272w, https://substackcdn.com/image/fetch/$s_!ctNF!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7895472c-4dd4-46be-be76-959204c6bd2f_1900x322.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>Enabling prefix caching delivered a <strong>32% speedup</strong> with zero code changes required. 
For the Discovery model&#8217;s specific use case&#8212;where every query shares the same system prompt defining its locality-focused, multilingual behavior&#8212;this optimization directly reduces latency and operational cost. Over thousands of daily queries, this improvement compounds into significant infrastructure savings and noticeably snappier user experiences.</p><h3>[4] Concurrency Scaling</h3><p>This test systematically measured how throughput and latency change as concurrent requests increase from <strong>1 to 100</strong>. Understanding the scaling curve helps with capacity planning and setting realistic expectations. Does performance degrade gracefully under load, or does it collapse catastrophically? At what point do you need additional hardware? </p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9j6D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9j6D!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 424w, https://substackcdn.com/image/fetch/$s_!9j6D!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 848w, https://substackcdn.com/image/fetch/$s_!9j6D!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 1272w, 
https://substackcdn.com/image/fetch/$s_!9j6D!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9j6D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png" width="1900" height="555" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:555,&quot;width&quot;:1900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:78366,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/181052284?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee64a14a-39f0-4319-a73b-44c579c6d68f_1900x614.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!9j6D!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 424w, https://substackcdn.com/image/fetch/$s_!9j6D!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 848w, https://substackcdn.com/image/fetch/$s_!9j6D!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 
1272w, https://substackcdn.com/image/fetch/$s_!9j6D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8afad452-5afb-42bf-a84e-418f93b9b62e_1900x555.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>The results demonstrate <strong>excellent scaling characteristics</strong> that defy naive expectations. <strong>Throughput increased nearly linearly with concurrency</strong> up to about 50 users, then continued improving at a diminishing but still positive rate. 
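</p><p>The sweep itself is straightforward to reproduce: submit batches of increasing size and record wall-clock throughput alongside the effective per-request latency. A sketch with placeholder model and prompt (GPU required):</p>

```python
import time

def effective_latency_ms(total_seconds: float, n_requests: int) -> float:
    """Wall-clock time amortized across a batch, in milliseconds."""
    return total_seconds / n_requests * 1000.0

def concurrency_sweep(model: str, prompt: str, sizes=(1, 10, 25, 50, 100)):
    """Requires a CUDA GPU; prints tok/s and amortized latency per batch size."""
    from vllm import LLM, SamplingParams

    llm = LLM(model=model)
    params = SamplingParams(max_tokens=128)
    for n in sizes:
        start = time.perf_counter()
        outputs = llm.generate([prompt] * n, params)
        elapsed = time.perf_counter() - start
        toks = sum(len(o.outputs[0].token_ids) for o in outputs)
        print(f"n={n:3d}  {toks / elapsed:7.1f} tok/s  "
              f"{effective_latency_ms(elapsed, n):6.1f} ms/req")

# On a GPU machine (placeholder model name):
# concurrency_sweep("Qwen/Qwen2.5-7B-Instruct",
#                   "Find a late-night pharmacy nearby.")
```

<p>The effective per-request figure is exactly this amortized number: total wall-clock time for the batch divided by the number of concurrent requests.</p><p>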
Remarkably, per-request latency dropped from 2.8 seconds (single request) to an <strong>effective 42ms per request</strong> when processing 100 concurrent requests in 4.2 seconds total. </p><p>This counterintuitive behavior demonstrates <strong>continuous batching working exactly as designed</strong>: the GPU stays fully utilized processing multiple requests simultaneously, and individual requests benefit from efficient batched matrix operations.</p><h3>[5] Chunked Prefill</h3><p>This test evaluated chunked prefill performance with long-context prompts containing approximately <strong>1,128 tokens</strong> each. Long prompts can cause latency spikes as the system must process the entire input before generating any output tokens. Chunked prefill addresses this by breaking prefill computation into smaller pieces, reducing time-to-first-token and improving perceived responsiveness. </p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BZPG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BZPG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 424w, https://substackcdn.com/image/fetch/$s_!BZPG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 848w, https://substackcdn.com/image/fetch/$s_!BZPG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 1272w, 
https://substackcdn.com/image/fetch/$s_!BZPG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BZPG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png" width="1900" height="316" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:316,&quot;width&quot;:1900,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:42637,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/181052284?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe173327f-fb6f-4868-aafa-fb3b4289b9c7_1900x353.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BZPG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 424w, https://substackcdn.com/image/fetch/$s_!BZPG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 848w, https://substackcdn.com/image/fetch/$s_!BZPG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 
1272w, https://substackcdn.com/image/fetch/$s_!BZPG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2cb3b4db-08ef-4ab6-9486-6d17749b4dc3_1900x316.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>For this moderate context length, chunked prefill showed <strong>minimal difference (~1%). </strong>This optimization becomes substantially more important with very long contexts exceeding <strong>4,000 tokens</strong>. For typical Discovery model queries, which involve relatively short user questions, this feature is less critical but remains available when processing longer documents or conversation histories.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DVTi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DVTi!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 424w, https://substackcdn.com/image/fetch/$s_!DVTi!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 848w, https://substackcdn.com/image/fetch/$s_!DVTi!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 1272w, 
https://substackcdn.com/image/fetch/$s_!DVTi!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DVTi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png" width="1503" height="511" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:511,&quot;width&quot;:1503,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:88018,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.ai/i/181052284?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c3e4cb5-ac78-46b1-a2e6-79b27874a8c3_1589x592.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DVTi!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 424w, https://substackcdn.com/image/fetch/$s_!DVTi!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 848w, https://substackcdn.com/image/fetch/$s_!DVTi!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 
1272w, https://substackcdn.com/image/fetch/$s_!DVTi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F13a7739c-6985-445d-8257-8cc1be05fa8b_1503x511.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3><strong>Conclusion</strong></h3><p>The benchmark comprehensively validates vLLM&#8217;s performance claim against HuggingFace Transformers. </p><p><em>For the fine-tuned Qwen2.5-7B  model specifically, these results demonstrate that a single A5000 GPU can serve a substantial user base effectively. 
The prefix caching optimization aligns perfectly with the model&#8217;s architectural use case, where every query shares the same system prompt defining its local business assistant persona. </em></p><p><em>The choice to benchmark against HuggingFace Transformers provides a meaningful baseline that practitioners universally understand. vLLM isn&#8217;t incrementally better&#8212;it&#8217;s an order of magnitude faster, representing the difference between a functional prototype and a genuinely production-ready system capable of serving real users reliably.</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading ValueCurve! Subscribe for free to receive new posts and apply support.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Exploratory Data Analysis: Unveiling Hidden Insights Through Data Visualization]]></title><description><![CDATA[EDA transforms numbers into narratives, revealing patterns and relationships that would otherwise remain hidden in spreadsheets.]]></description><link>https://on.valuecurve.ai/p/exploratory-data-analysis-unveiling</link><guid isPermaLink="false">https://on.valuecurve.ai/p/exploratory-data-analysis-unveiling</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Wed, 19 Nov 2025 03:31:04 GMT</pubDate><enclosure 
url="https://substack-post-media.s3.amazonaws.com/public/images/8fc70078-ff76-4f0f-bf29-3fbb23a41f71_3000x2001.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>In the age of big data, the ability to extract meaningful insights from raw information has become invaluable. Exploratory Data Analysis (EDA) serves as the critical first step in any data science project, allowing analysts to understand, clean, and visualize data before diving into complex modeling. When combined with effective data visualization techniques, EDA transforms numbers into narratives, revealing patterns and relationships that would otherwise remain hidden in spreadsheets.</em></p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;88fe3e0d-794d-4fb0-aa4a-8b063c009e54&quot;,&quot;duration&quot;:null}"></div><div><hr></div><h3>What is Exploratory Data Analysis?</h3><p>Exploratory Data Analysis is an approach for examining datasets and summarizing their main characteristics, often using visual methods. Pioneered by statistician John Tukey in the 1970s, EDA emphasizes the importance of looking at data before making assumptions or building models. Unlike confirmatory analysis, which tests specific hypotheses, EDA is an open-ended process of discovery.</p><p>The primary objectives of EDA include:</p><ul><li><p>Understanding the structure and distribution of data</p></li><li><p>Detecting outliers and anomalies</p></li><li><p>Identifying patterns, trends, and relationships between variables</p></li><li><p>Testing underlying assumptions</p></li><li><p>Selecting appropriate models for further analysis</p></li></ul><p>EDA is not a rigid set of procedures but rather a state of mind that encourages curiosity and thorough investigation. It asks questions like: What does the data tell us? What doesn&#8217;t it tell us? What patterns emerge? 
What surprises exist?</p><h3>The Power of Data Visualization in EDA</h3><p>Data visualization transforms abstract numbers into concrete visual representations, making complex data accessible and understandable. Our brains process visual information far faster than text, making visualization an indispensable tool for EDA. Well-designed visualizations can reveal trends, outliers, and relationships that might take hours to discover through statistical analysis alone.</p><p><strong>Key visualization techniques in EDA include:</strong></p><p><strong>Histograms and Distribution Plots:</strong> Reveal the shape and spread of data</p><p><strong>Scatter Plots:</strong> Uncover relationships between variables</p><p><strong>Box Plots:</strong> Identify outliers and compare distributions across groups</p><p><strong>Correlation Heatmaps:</strong> Display relationships between multiple variables</p><p><strong>Time Series Plots:</strong> Track changes over time</p><p><strong>Animated Visualizations:</strong> Show evolution of patterns across dimensions</p><div><hr></div><h3>Understanding the Gapminder Dataset</h3><p>The <a href="https://www.gapminder.org">Gapminder</a> dataset, curated by the <strong>Gapminder Foundation</strong>, provides a fascinating lens through which to explore global development trends. It contains data for 142 countries from 1952 to 2007, tracking three critical metrics:</p><p><strong>Life Expectancy</strong></p><p>This metric measures the average number of years a person is expected to live at birth. In our dataset, life expectancy ranges from a sobering 23.6 years (Rwanda, 1992, during the genocide) to an impressive 82.6 years (Japan, 2007). This dramatic range immediately tells a story about global health disparities and the impact of conflict on human welfare.</p><p><strong>Population</strong></p><p>Population figures range from 60,011 (Sao Tome and Principe) to over 1.3 billion (China, 2007). 
This metric helps us understand demographic pressures, economic potential, and resource allocation challenges facing different nations.</p><p><strong>GDP per Capita</strong></p><p>Measured in international dollars, GDP per capita ranges from $241.17 to $113,523.13 (Kuwait during an oil boom). This economic indicator reveals the vast wealth inequality between nations and provides context for understanding quality of life differences.</p><h3>Key Statistical Metrics in EDA</h3><p><strong>Measures of Central Tendency</strong></p><ul><li><p><strong>Mean:</strong> The average provides a quick snapshot but can be skewed by outliers</p></li><li><p><strong>Median:</strong> The middle value is more robust to extreme values</p></li><li><p><strong>Mode:</strong> The most frequent value, particularly useful for categorical data</p></li></ul><p>For the Gapminder dataset, comparing mean and median GDP per capita reveals positive skewness&#8212;a few wealthy nations pull the average higher than the median, indicating wealth concentration.</p><p><strong>Measures of Dispersion</strong></p><ul><li><p><strong>Standard Deviation:</strong> Quantifies variation around the mean</p></li><li><p><strong>Range:</strong> The difference between maximum and minimum values</p></li><li><p><strong>Interquartile Range (IQR):</strong> The spread of the middle 50% of data, robust to outliers</p></li><li><p><strong>Coefficient of Variation (CV):</strong> The standard deviation divided by the mean, enabling comparison of variability across different scales</p></li></ul><p>The CV is particularly valuable when comparing metrics with different units. For instance, while population has enormous absolute variation, its CV reveals whether this variation is proportionally larger than life expectancy&#8217;s variation.</p><p><strong>Percentiles and Quartiles</strong></p><p>Percentiles divide data into 100 equal parts, providing detailed distribution information. 
The 25th, 50th (median), and 75th percentiles are especially important, forming the basis of box plots and revealing data symmetry or skewness.</p><p><strong>Correlation Analysis</strong></p><p>Correlation coefficients (ranging from -1 to +1) measure the strength and direction of linear relationships between variables. In the Gapminder data, life expectancy and GDP per capita show strong positive correlation (typically &gt; 0.7), suggesting that wealthier nations tend to have longer-lived populations. However, it&#8217;s crucial to remember that correlation does not imply causation.</p><div><hr></div><h3>Advantages of Systematic EDA</h3><p><strong>Early Problem Detection</strong></p><p>EDA reveals data quality issues before they corrupt analysis results. Missing values, incorrect data types, duplicate records, and inconsistent formatting become apparent through summary statistics and visualizations.</p><p><strong>Insight Generation</strong></p><p>Unexpected patterns often emerge during EDA. For example, examining the Gapminder data might reveal that some countries experienced dramatic life expectancy drops during specific years, prompting investigation into historical events like wars or epidemics.</p><p><strong>Communication Enhancement</strong></p><p>Visualizations transcend language barriers and technical expertise levels. A well-designed animated scatter plot showing the evolution of life expectancy versus GDP over time can convey decades of development history in seconds, making data accessible to stakeholders at all levels.</p><p><strong>Model Selection Guidance</strong></p><p>Understanding data distributions, relationships, and outliers helps select appropriate analytical methods. 
For instance, discovering non-linear relationships between variables suggests the need for polynomial regression rather than simple linear models.</p><h3>Common Pitfalls in Data Analysis</h3><p><strong>Confirmation Bias</strong></p><p>Analysts sometimes unconsciously seek patterns that confirm pre-existing beliefs while ignoring contradictory evidence. EDA should be approached with an open mind, allowing data to speak rather than forcing it into predetermined narratives.</p><p><strong>Over-reliance on Means</strong></p><p>The mean can be misleading when data is skewed or contains outliers. Always examine medians and full distributions. For GDP data, the mean might suggest average prosperity while hiding extreme inequality.</p><p><strong>Ignoring Context</strong></p><p>Statistical patterns without domain knowledge can lead to absurd conclusions. A correlation between ice cream sales and drowning deaths doesn&#8217;t mean ice cream causes drowning&#8212;both increase during summer.</p><p><strong>Misleading Visualizations</strong></p><p>Poor chart choices, manipulated axes, cherry-picked data ranges, and inappropriate color schemes can distort reality. Always ensure visualizations accurately represent data without exaggeration or omission.</p><p><strong>Correlation-Causation Confusion</strong></p><p>Perhaps the most dangerous pitfall is inferring causation from correlation. While Gapminder shows strong correlation between GDP and life expectancy, this doesn&#8217;t prove that increasing GDP directly causes longer lives. Confounding variables like education, healthcare access, and sanitation play crucial roles.</p><p><strong>Neglecting Outliers</strong></p><p>While outliers can distort analyses, they often contain valuable information. 
Rwanda&#8217;s drastically low life expectancy in 1992 isn&#8217;t noise to be removed&#8212;it&#8217;s a crucial data point reflecting historical tragedy that demands acknowledgment and investigation.</p><p><strong>Analysis Paralysis</strong></p><p>EDA is meant to be iterative and exploratory, but endless exploration without moving toward conclusions wastes resources. Setting clear objectives and timelines prevents perpetual analysis without action.</p><h3>Conclusion</h3><p><em>Exploratory Data Analysis, powered by thoughtful visualization, transforms raw data into actionable insights. The Gapminder dataset exemplifies how systematic exploration of metrics like life expectancy, population, and GDP per capita can reveal global development patterns, inequalities, and trends. By understanding key statistical measures&#8212;from means and medians to correlations and percentiles&#8212;and avoiding common analytical pitfalls, data analysts can extract genuine value from information.</em></p><p><em>In our data-driven world, EDA skills are not merely technical competencies but essential literacies for making informed decisions. Whether examining global health trends, business performance, or scientific phenomena, the principles remain constant: approach data with curiosity, visualize thoughtfully, measure carefully, and always question assumptions. 
Through disciplined EDA, we transform data from mere numbers into knowledge that drives understanding and progress.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[Fast Prototyping with Streamlit & Gen AI Tools]]></title><description><![CDATA[Streamlit, an open-source Python library, makes this possible]]></description><link>https://on.valuecurve.ai/p/fast-prototyping-with-streamlit-and</link><guid isPermaLink="false">https://on.valuecurve.ai/p/fast-prototyping-with-streamlit-and</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 19 Oct 2025 03:31:32 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!NVI5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Prototyping is the bridge between an idea and a working product. In the world of generative AI, where tools and expectations evolve quickly, the ability to move from concept to demo in hours&#8212;not weeks&#8212;is a real advantage. Streamlit, an open-source Python library, makes this possible. 
It allows you to turn a few lines of code into interactive, shareable web apps without needing front-end expertise.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>This post draws from the <strong><a href="https://www.google.com/search?client=safari&amp;sca_esv=29fb23f7ce376d27&amp;rls=en&amp;sxsrf=AE3TifM3VSasBnpMoxh6kB3RR3_DCkm4Yg:1759489165873&amp;q=Fast+Prototyping+with+Streamlit&amp;source=lnms&amp;fbs=AIIjpHwdlVWI4oi2g38E8_BbusNm3pTf6ItdW8-u0JVVBgXow2SS4XfWu_GDEb99WFnlrQTRreI6irPtfZJtDa4EEIgg37xdYGx3qn9unJJAquZz8WtKP8Kf0TFOreY3xEqie4ng9v3Jjl55fYppyAVSeNqtNMX-hP0YsEVR8UT5CEKA3fw2492143UKUQH5rm3X3UVJ1uDgTGmNMINiAyslUFltr3ZPMazhRYK-JCElXLa4gCd7tZ4&amp;sa=X&amp;ved=2ahUKEwiD_N7174eQAxX7yDgGHfJoEKkQ0pQJegQICRAB&amp;biw=1421&amp;bih=758&amp;dpr=2">Fast Prototyping of GenAI Apps with Streamlit</a></strong> course by <strong><a href="https://www.deeplearning.ai">DeepLearning.AI</a></strong> and <a href="https://www.snowflake.com/en/">Snowflake</a> . This course is taught by <strong><a href="https://www.linkedin.com/in/chanin-nantasenamat/">Chanin Nantasenamat</a></strong> , Sr Developer Advocate also known as the Data Professor. 
</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!NVI5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!NVI5!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 424w, https://substackcdn.com/image/fetch/$s_!NVI5!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 848w, https://substackcdn.com/image/fetch/$s_!NVI5!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 1272w, https://substackcdn.com/image/fetch/$s_!NVI5!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!NVI5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png" width="1080" height="638" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:638,&quot;width&quot;:1080,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Fast Prototyping of GenAI 
Apps with Streamlit - DeepLearning.AI&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Fast Prototyping of GenAI Apps with Streamlit - DeepLearning.AI" title="Fast Prototyping of GenAI Apps with Streamlit - DeepLearning.AI" srcset="https://substackcdn.com/image/fetch/$s_!NVI5!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 424w, https://substackcdn.com/image/fetch/$s_!NVI5!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 848w, https://substackcdn.com/image/fetch/$s_!NVI5!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 1272w, https://substackcdn.com/image/fetch/$s_!NVI5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F63379fce-54d3-4dd8-b4a6-5a6e5f4570ec_1080x638.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 
8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h4>Why Fast Prototyping Matters</h4><p>Ideas often lose momentum when they stay in documents or slide decks. In AI, where experimentation is key, the faster you can test and share, the better. Rapid prototyping helps you:</p><ul><li><p><strong>Validate assumptions</strong>: Instead of debating whether an idea works, you can show it.</p></li><li><p><strong>Gather feedback early</strong>: A working demo sparks more useful conversations than abstract descriptions.</p></li><li><p><strong>Iterate quickly</strong>: You can refine based on real interactions, not speculation.</p></li><li><p><strong>Influence decisions</strong>: Stakeholders respond to tangible prototypes more than theoretical plans.</p></li></ul><p>Generative AI adds another layer: with large language models (LLMs), you can build functional prototypes with minimal code. Streamlit provides the interface to make those prototypes usable and shareable.</p><div><hr></div><h4>What is Streamlit?</h4><p>Streamlit is a Python library that transforms scripts into interactive web apps. 
Its appeal lies in simplicity:</p><ul><li><p><strong>Minimal code</strong>: A few lines can create buttons, sliders, text inputs, or file uploaders.</p></li><li><p><strong>No front-end skills required</strong>: You don&#8217;t need to know HTML, CSS, or JavaScript.</p></li><li><p><strong>Instant sharing</strong>: Apps can be deployed on Streamlit Community Cloud or integrated into platforms like Snowflake.</p></li></ul><p>For data scientists, researchers, and AI developers, this means you can focus on logic and models while still delivering polished, interactive demos.</p><div><hr></div><h4>The Prototyping Workflow</h4><p>The course outlines a practical workflow for building GenAI apps with Streamlit. Here&#8217;s a simplified version:</p><ol><li><p><strong>Start Small</strong><br>Begin with a minimal app&#8212;often a chatbot powered by an LLM. The goal is to get something working quickly, not to perfect it.</p></li><li><p><strong>Layer in Prompt Engineering</strong><br>Improve the quality of responses by refining prompts. Streamlit makes it easy to expose prompt variations through text boxes or dropdowns, so you can experiment interactively.</p></li><li><p><strong>Add Retrieval-Augmented Generation (RAG)</strong><br>Connect your app to external data sources. For example, you might let the chatbot answer questions based on a company&#8217;s knowledge base or a dataset stored in Snowflake.</p></li><li><p><strong>Deploy for Feedback</strong><br>Push the prototype to Streamlit Community Cloud or Snowflake. Share the link with colleagues or users and gather feedback.</p></li><li><p><strong>Iterate</strong><br>Use the feedback to refine prompts, improve data connections, or adjust the interface. Because Streamlit apps are lightweight, iteration cycles are fast.</p></li></ol><div><hr></div><h4>Example: A Simple Chatbot</h4><p>Here&#8217;s a minimal Streamlit app that connects to an LLM (using OpenAI as an example):</p><pre><code><code>import streamlit as st
import openai

# Set your OpenAI API key
openai.api_key = st.secrets["OPENAI_API_KEY"]

st.title("Quick Chatbot Prototype")

# Initialize chat history
if "messages" not in st.session_state:
    st.session_state.messages = []

user_input = st.text_input("Ask me anything:")

if user_input:
    # Append user message to chat history
    st.session_state.messages.append({"role": "user", "content": user_input})

    # Call OpenAI ChatCompletion with the full conversation history to maintain context
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=st.session_state.messages
    )

    answer = response["choices"][0]["message"]["content"]

    # Append assistant's response to chat history
    st.session_state.messages.append({"role": "assistant", "content": answer})

    # Display the assistant's response
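    # Optional extension (a sketch, not part of the minimal app): instead of
    # showing only the latest answer, you could replay the whole conversation,
    # assuming st.chat_message is available (Streamlit >= 1.24):
    #
    #   for msg in st.session_state.messages:
    #       with st.chat_message(msg["role"]):
    #           st.write(msg["content"])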
    st.write(answer)</code></code></pre><p>This script:</p><ul><li><p>Creates a text input box.</p></li><li><p>Sends the input to an LLM.</p></li><li><p>Displays the response.</p></li></ul><p>It&#8217;s only a few lines of code, but it produces a working chatbot you can share.</p><div><hr></div><h4>Adding Prompt Engineering</h4><p>Prompt engineering is about shaping the model&#8217;s behavior. With Streamlit, you can expose prompt templates as editable fields:</p><pre><code><code>import streamlit as st
import openai

# Set OpenAI API key securely
openai.api_key = st.secrets["OPENAI_API_KEY"]

prompt_template = st.text_area(
    "Prompt template:",
    "You are a helpful assistant. Answer clearly and concisely.\n\nUser: {question}\nAssistant:"
)

user_input = st.text_input("Ask me anything:")

if user_input:
    prompt = prompt_template.format(question=user_input)
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=200
    )
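    # Sketch of a further tweak: sampling controls can be exposed interactively
    # the same way as the prompt, e.g. with a slider whose value you would pass
    # into the create() call above as temperature=temperature:
    #
    #   temperature = st.slider("Temperature", 0.0, 1.0, 0.7)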
    st.write(response["choices"][0]["text"])</code></code></pre><p>Now you can experiment with different instructions without changing the code.</p><div><hr></div><h4>Adding RAG (Retrieval-Augmented Generation)</h4><p>RAG combines LLMs with external data. For example, you might let the chatbot answer based on a set of documents. A simplified version looks like this:</p><pre><code><code>import streamlit as st
import openai

openai.api_key = st.secrets["OPENAI_API_KEY"]

# A toy in-memory document store for illustration; in a real app this could be
# a vector database, a search index, or tables in Snowflake.
documents = [
    "Refunds are accepted within 30 days of purchase with a valid receipt.",
    "Support hours are Monday to Friday, 9am to 5pm Eastern Time.",
    "Premium plans include priority support and extended analytics.",
]

def retrieve(question, docs, top_k=2):
    # Naive keyword-overlap retrieval, just to show the RAG pattern;
    # swap in embeddings and a vector store for real use.
    words = set(question.lower().split())
    return sorted(docs, key=lambda d: -len(words & set(d.lower().split())))[:top_k]

user_input = st.text_input("Ask me anything:")

if user_input:
    # Ground the prompt in the retrieved context before calling the model
    context = "\n".join(retrieve(user_input, documents))
    prompt = (
        "You are a helpful assistant. Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {user_input}\nAnswer:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=200
    )
    st.write(response["choices"][0]["text"])</code></code></pre><p>This setup allows the chatbot to ground its answers in your own dataset, making it more useful for specific domains.</p><div><hr></div><h4>Deployment</h4><p>When building a prototype, keep things simple and focus on the main interaction&#8212;don&#8217;t add unnecessary features. Make it easy for users to adjust prompts, settings, or data inputs. Share your work early and improve it quickly through feedback. If your prototype handles sensitive data, make sure it runs in a secure environment.</p><p>Once your prototype works locally, you can deploy it:</p><ul><li><p><strong>Streamlit Community Cloud</strong>: Free and simple for sharing demos.</p></li><li><p><strong>Snowflake + Streamlit</strong>: For secure, production-ready environments with enterprise data.</p></li></ul><p>Deployment is as simple as pushing your code to GitHub and linking it to Streamlit Cloud.</p><div><hr></div><h4>Wrapping Up</h4><p><em>Streamlit <strong>doesn&#8217;t</strong> replace production systems, but it gives you a powerful way to explore, test, and communicate ideas. In the fast-moving world of generative AI, that speed of exploration is often the difference between leading and lagging.</em></p><p><em>Rapid prototyping with Streamlit is less about building polished products and more about accelerating learning. By lowering the barrier to creating interactive GenAI apps, it allows individuals and teams to validate ideas quickly, gather feedback, and refine direction.</em></p><p><em>In practice, this means fewer stalled discussions and more tangible progress. 
Whether you&#8217;re experimenting with a chatbot, a summarizer, or a data assistant, the workflow is the same: start small, iterate fast, and share early.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Tools Shaping the Future of AI Development]]></title><description><![CDATA[From compact reasoning models to full-stack agent platforms and desktop-scale supercomputers, these tools drive innovation in AI reasoning, auditing, deployment, and experimentation.]]></description><link>https://on.valuecurve.ai/p/tools-shaping-the-future-of-ai-development</link><guid isPermaLink="false">https://on.valuecurve.ai/p/tools-shaping-the-future-of-ai-development</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Wed, 15 Oct 2025 03:31:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!RwDP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>This collection highlights six advanced AI tools and platforms driving innovation in reasoning, agent development, auditing, and deployment. 
</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!RwDP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!RwDP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 424w, https://substackcdn.com/image/fetch/$s_!RwDP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 848w, https://substackcdn.com/image/fetch/$s_!RwDP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 1272w, https://substackcdn.com/image/fetch/$s_!RwDP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!RwDP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png" width="720" height="394" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ee87a19f-b427-45d2-b494-826672f7960d_720x394.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:394,&quot;width&quot;:720,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Samsung Logo | Brand Identity | Samsung US&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Samsung Logo | Brand Identity | Samsung US" title="Samsung Logo | Brand Identity | Samsung US" srcset="https://substackcdn.com/image/fetch/$s_!RwDP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 424w, https://substackcdn.com/image/fetch/$s_!RwDP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 848w, https://substackcdn.com/image/fetch/$s_!RwDP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 1272w, https://substackcdn.com/image/fetch/$s_!RwDP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fee87a19f-b427-45d2-b494-826672f7960d_720x394.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" 
stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>[1] Samsung SAIL Montreal&#8217;s</strong> <strong><a href="https://github.com/SamsungSAILMontreal/TinyRecursiveModels">Tiny Recursive Models (TRM) </a></strong>represent a minimalist approach to AI reasoning, using a small neural network with just 7 million parameters to tackle complex tasks. TRM challenges the reliance on large language models by showing that recursive reasoning&#8212;where the model iteratively refines its answers&#8212;can yield strong results with minimal compute. The model begins with an embedded question, answer, and latent state, then updates its latent state and answer over multiple steps to improve accuracy. </p><p>TRM achieved notable scores on <em>ARC-AGI</em> benchmarks (45% on ARC-AGI-1 and 8% on ARC-AGI-2), levels typically reached by much larger models. It avoids complex theoretical constructs, focusing instead on practical recursion. 
The codebase is open-source, built in Python with CUDA and PyTorch, and has been tested on datasets like ARC-AGI, Sudoku-Extreme, and Maze-Hard. This work underscores the potential of compact models in AI safety and reasoning research.</p><p><strong>[2] </strong>The <strong><a href="https://docs.claude.com/en/api/agent-sdk/overview">Claude Agent SDK</a></strong> is a developer toolkit for building and deploying custom AI agents. It supports both TypeScript (for Node.js and web apps) and Python (for data science), with streaming and single input modes. The SDK is built on the Claude Code agent harness, offering prompt caching and performance optimization. Key features include automatic context management, error handling, session control, and monitoring&#8212;essential for production use. Agents can use built-in tools for file operations, code execution, and web search, and connect to external services via the Model Context Protocol (MCP). </p><p>Developers can define agent roles using System Prompts and control tool access with allowedTools or disallowedTools. The SDK supports Claude Code features like Subagents, Hooks, and Slash Commands through file-based configuration. It enables various agent types, including coding agents (e.g., SRE bots, code reviewers) and business agents (e.g., legal assistants, finance advisors, support bots). Authentication requires an API key via the <strong>ANTHROPIC_API_KEY</strong> environment variable, with optional support for Amazon Bedrock and Google Vertex AI. Overall, the SDK provides a structured, extensible foundation for building reliable, task-specific AI agents.</p><div><hr></div><p><strong>[3]</strong> <strong><a href="https://alignment.anthropic.com/2025/petri/">Petri (Parallel Exploration Tool for Risky Interactions)</a></strong> is an open-source framework for auditing AI models by automating behavior testing across diverse scenarios. 
Built on the UK AI Security Institute&#8217;s Inspect framework, Petri supports most model APIs and reduces the manual effort needed for alignment evaluations. Its process includes four steps: forming hypotheses about risky behaviors, writing seed instructions for audit scenarios, running automated assessments via an auditor agent and a judge, and iterating based on transcript scores. The auditor agent simulates interactions with the target model, adjusting its approach dynamically. The judge scores transcripts across multiple dimensions, extracting highlights and summaries to identify misaligned behaviors. </p><p>Petri has surfaced issues like <em>deception</em>, <em>oversight subversion</em>, and <em>whistleblowing</em> in frontier models. In pilot tests, Claude Sonnet 4.5 and GPT-5 showed strong safety profiles, while others like Gemini 2.5 Pro and Grok-4 raised concerns. Limitations include realism gaps in transcripts, reliance on human-generated hypotheses, auditor model constraints, and judge subjectivity. Petri is extensible and includes 111 sample seed instructions, enabling rapid exploration and customization of audit tools and scoring systems.</p><p><strong>[4] <a href="https://docs.claude.com/en/api/agent-sdk/overview">AgentKit</a></strong> is <strong>OpenAI&#8217;s</strong> full-stack platform for building, deploying, and optimizing AI agents, replacing earlier tools like the Agents SDK and Responses API. It includes Agent Builder, a visual canvas for designing multi-agent workflows with drag-and-drop nodes, preview runs, tool integration, and version control. ChatKit enables seamless embedding of chat-based agents into products or websites, handling streaming, thread management, and customizable UI. The Connector Registry provides enterprises with a centralized panel to manage data and tool integrations, including pre-built connectors and third-party Model Context Protocols (MCPs). 
Guardrails offer a modular safety layer to detect jailbreaks and protect sensitive data. </p><p>AgentKit also expands evaluation capabilities through Evals, supporting dataset creation, trace grading, prompt optimization, and third-party model assessment. For advanced tuning, it includes Reinforcement Fine-Tuning (RFT), available on o4-mini and in beta for GPT-5, allowing custom tool call training and grader configuration. As of October 2025, ChatKit and Evals are generally available, while Agent Builder remains in beta. AgentKit is designed to streamline agent development for both individual developers and enterprise teams.</p><div><hr></div><p><strong>[5]</strong> <strong><a href="https://www.docker.com/blog/ibm-granite-4-0-models-now-available-on-docker-hub/">IBM Granite 4.0</a></strong> is a family of open-source language models available on the Docker Hub model catalog, enabling developers to quickly build generative AI applications using Docker Model Runner. Designed for speed, flexibility, and cost-efficiency, Granite 4.0 combines enterprise-grade performance with a lightweight footprint, making it ideal for local prototyping and scalable deployment. Licensed under Apache 2.0, the models are customizable and commercially usable. </p><p>Technically, Granite 4.0 uses a hybrid architecture that merges Mamba-2&#8217;s linear efficiency with transformer precision, and select models apply a Mixture of Experts (MoE) strategy to reduce memory usage by over 70%. It also supports extremely long context lengths&#8212;up to 128,000 tokens&#8212;limited only by hardware. The model lineup includes H-Small (32B total, ~9B active) for RAG and agents on L4 GPUs, H-Tiny (7B total, ~1B active) for edge deployment on RTX 3060, H-Micro and Micro (3B dense) for ultra-light or fallback use cases. These variants support development on accessible hardware. 
With Docker Model Runner, developers can deploy models via an OpenAI-compatible API for tasks like document analysis, advanced RAG systems, multi-agent workflows, and edge AI applications.</p><p><strong>[6] </strong>The <strong><a href="https://www.asus.com/networking-iot-servers/desktop-ai-supercomputer/ultra-small-ai-supercomputers/asus-ascent-gx10/">ASUS Ascent GX10 </a></strong>is a compact desktop AI supercomputer built on <strong>NVIDIA</strong> <strong>DGX&#8482; Spark </strong>and powered by the NVIDIA&#174; GB10 Grace Blackwell Superchip. It delivers 1 petaFLOP of AI performance using FP4 and features a fifth-generation Blackwell GPU, 128 GB of LPDDR5x unified memory, and a high-performance 20-core Arm CPU for fast training and inference. With NVIDIA&#174; NVLink&#8482;-C2C and ConnectX-7 networking, it supports scalable multi-GX10 setups for handling models like Llama 3.1 with 405 billion parameters. Designed for minimal footprint and high reliability, the GX10 includes QuietFlow Cooling, dual vapor chambers, and passes MIL-STD 810H durability tests. It supports up to five 4K displays and NVIDIA DLSS 4 for enhanced visuals. </p><p>The system runs NVIDIA DGX&#8482; OS with Ubuntu and comes preloaded with CUDA, PyTorch, TensorFlow, Jupyter, TensorRT, NIM&#8482;, and Blueprints. It enables development and fine-tuning of models up to 200 billion parameters and supports workloads across generative AI, computer vision, analytics, and simulation. Models can be transitioned to DGX Cloud or other infrastructures with minimal code changes. Connectivity includes multiple USB-C ports, HDMI 2.1b, 10 GbE LAN, and a ConnectX-7 NIC, making it a powerful, developer-optimized platform for AI experimentation and deployment.</p><div><hr></div><p><em>Together, these platforms reflect a shift toward modular, efficient, and developer-accessible AI infrastructure. 
From minimalist reasoning models and scalable agent frameworks to open-source language models and compact supercomputers, the ecosystem is evolving to support rapid prototyping, safe deployment, and high-performance experimentation across diverse AI workloads. These innovations empower researchers, developers, and enterprises to build more capable, aligned, and accessible AI systems.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Understanding Machine Learning Pipeline]]></title><description><![CDATA[A machine learning pipeline transforms raw inputs into reliable, production-ready models through distinct and clear stages]]></description><link>https://on.valuecurve.ai/p/understanding-the-machine-learning</link><guid isPermaLink="false">https://on.valuecurve.ai/p/understanding-the-machine-learning</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 12 Oct 2025 03:31:26 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!VEC8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>A machine learning pipeline transforms raw inputs into reliable, production-ready models through distinct and clear stages. Each phase adds structure and checks, from gathering data to keeping a live model healthy. 
This guide walks you through Data Collection, Feature Engineering, Model Training, Evaluation, Deployment, Monitoring, and Maintenance in straightforward terms. </em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>The purpose of this post is to provide readers with a <em><strong>mental map</strong></em> for any ML project. Let&#8217;s dive in.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!VEC8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!VEC8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 424w, https://substackcdn.com/image/fetch/$s_!VEC8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 848w, https://substackcdn.com/image/fetch/$s_!VEC8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 1272w, https://substackcdn.com/image/fetch/$s_!VEC8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!VEC8!,w_2400,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png" width="974" height="487" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;large&quot;,&quot;height&quot;:728,&quot;width&quot;:1456,&quot;resizeWidth&quot;:974,&quot;bytes&quot;:197445,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/175117038?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-large" alt="" srcset="https://substackcdn.com/image/fetch/$s_!VEC8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 424w, https://substackcdn.com/image/fetch/$s_!VEC8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 848w, https://substackcdn.com/image/fetch/$s_!VEC8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 1272w, 
https://substackcdn.com/image/fetch/$s_!VEC8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb5273654-4381-40a9-83c2-1dc2eb56bf5a_2641x1320.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h4><strong>1. Data Collection</strong></h4><p>The process begins with gathering relevant data from various sources. These might include databases, logs, CSV files, APIs, and sensor feeds. It&#8217;s crucial to define what is needed before pulling in every field. Objectives and data schema should be sketched out early to save time in later stages. 
Privacy rules and security policies must be respected during record ingestion. Each dataset should be tagged with metadata that captures when, where, and how it arrived. This context supports traceability and reproducibility in analysis and modeling. A solid foundation of clean, well-understood inputs reduces surprises down the road.</p><p><strong>2. Feature Engineering</strong></p><p>Raw attributes rarely arrive in the shape and scale a model requires. Feature engineering transforms these inputs into meaningful variables. Numeric fields might be normalized, categorical ones encoded, or parts of timestamps extracted. Domain expertise helps craft interaction terms or rolling averages for time-series data. For text tasks, tokenization or embeddings convert words into numbers. Techniques like PCA or feature selection help manage high dimensionality. Each crafted feature should carry a clear rationale about its role in prediction. Well-engineered features often boost performance more than complex algorithms.</p><div><hr></div><p><strong>3. Model Training</strong></p><p>With features in place, a model is trained to learn underlying patterns. The process starts by selecting an algorithm family&#8212;linear models, decision trees, or neural networks&#8212;based on the problem. Data is split into training and validation sets to catch overfitting before it sneaks in. Hyperparameter tuning is automated with grid search, random search, or Bayesian methods. Training metrics like loss curves and convergence behavior are monitored. Each experiment&#8217;s configuration and results are tracked with an experiment management tool. Automating repeatable training pipelines prevents manual errors and ensures reproducibility. A robust training loop lays the groundwork for reliable predictions.</p><p><strong>4. Evaluation</strong></p><p>Evaluation measures how well a model performs on unseen data. 
Metrics should align with real-world goals&#8212;accuracy, precision/recall, RMSE, AUC, or custom business KPIs. A hold-out test set that never touched training or tuning is reserved to avoid leakage. Performance is visualized with confusion matrices or ROC curves to spot strengths and weaknesses. Error cases are drilled into to understand model blind spots. Bias is checked by comparing results across different subgroups. Findings are documented clearly to guide stakeholders on strengths and limitations. Thorough validation builds confidence before production rollout.</p><div><hr></div><p><strong>5. Deployment</strong></p><p>Deployment turns a validated model into a usable service or batch task. The model is containerized with Docker or packaged for serverless platforms. Inference logic is wrapped in a simple API or integrated into an existing data pipeline. Model artifacts and API definitions are version-controlled to track changes over time. Deployment pipelines are automated to eliminate manual hand-offs and reduce errors. Endpoints are load-tested to verify they meet latency and throughput requirements. Fallback logic or circuit breakers are included to handle unexpected failures gracefully. Smooth, repeatable deployment processes make going live painless and predictable.</p><p><strong>6. Monitoring</strong></p><p>Once in production, a model enters a dynamic environment that can drift over time. Input feature distributions are monitored to catch covariate drift early. Performance metrics are tracked against expected baselines or business KPIs continuously. Logs are instrumented for latency, error rates, and resource usage. Alerts are set to trigger when metrics deviate beyond safe thresholds. Diagnostic dashboards are used to spot anomalies at a glance. Regular health checks ensure the service remains stable under varying loads.</p><div><hr></div><p><strong>7. Maintenance</strong></p><p>Maintenance keeps a model effective as data and requirements evolve. 
Over time, input distributions shift, feature relevance changes, and new business rules emerge. A proactive maintenance plan ensures the pipeline doesn&#8217;t silently degrade.</p><p>A retraining cadence is established based on data volume and observed drift. Every model artifact&#8212;training code, feature transformations, and weights&#8212;is versioned to allow rollback or comparison of historic performance.</p><p>Tests are automated to validate feature pipelines and inference logic after updates. New data sources or labels are incorporated when business needs shift, then their impact is validated through the evaluation suite. Obsolete features are rotated out and their effect on performance monitored to prevent bloat.</p><p><strong>Wrapping Up</strong></p><p><em>A robust machine learning pipeline weaves together data collection, feature engineering, model training, evaluation, deployment, monitoring, and maintenance. Each stage builds on the last to create a transparent, reproducible path from raw inputs to reliable predictions. Skipping any phase invites hidden bias, untraceable errors, and brittle systems.</em></p><p><em>When this end-to-end flow is followed, the right data is captured with clear ownership. That data is distilled into features that carry meaningful signals. Models are trained under controlled conditions and validated on truly unseen cases. Deployment is handled with repeatable pipelines, version control, and safety nets. Monitoring is continuous, adapting to real-world drift rather than hoping nothing breaks. </em></p><p><em>With this structure in place, engineers can iterate on features or algorithms while the team focuses on building next-generation capabilities. 
More importantly, stakeholders can see clear metrics that connect model outputs to business impact.</em></p><div><hr></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Matplotlib for Data Visualization ]]></title><description><![CDATA[Python Library for Data Analysis & Visualization]]></description><link>https://on.valuecurve.ai/p/matplotlib-for-data-visualisation</link><guid isPermaLink="false">https://on.valuecurve.ai/p/matplotlib-for-data-visualisation</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Fri, 10 Oct 2025 13:35:14 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!TEc7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png" length="0" type="image/png"/><content:encoded><![CDATA[<p><em>Matplotlib is a versatile Python library for generating both simple and complex visualizations. It mimics MATLAB&#8217;s plotting interface through </em><code>pyplot</code><em> and integrates seamlessly with NumPy and Pandas. 
Whether you need quick exploratory plots or publication&#8209;ready figures, Matplotlib offers fine&#8209;grained control over every element of your chart.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TEc7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TEc7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 424w, https://substackcdn.com/image/fetch/$s_!TEc7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 848w, https://substackcdn.com/image/fetch/$s_!TEc7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 1272w, https://substackcdn.com/image/fetch/$s_!TEc7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!TEc7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png" width="983" height="545" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:545,&quot;width&quot;:983,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:94443,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/175012708?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!TEc7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 424w, https://substackcdn.com/image/fetch/$s_!TEc7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 848w, https://substackcdn.com/image/fetch/$s_!TEc7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 1272w, https://substackcdn.com/image/fetch/$s_!TEc7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d79ef1c-5ae1-48ee-b311-98f9ee730c03_983x545.png 1456w" sizes="100vw" 
fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h4>Core Plotting and Customization </h4><p><a href="https://www.linkedin.com/in/keithgalli/">Keith Galli</a>, an MIT graduate, has published over 100 videos on Computer Science, Programming &amp; Data Analysis. A few years ago, Keith created a Python plotting <a href="https://www.youtube.com/watch?v=0P7QnIQDBJY">crash course</a> using Matplotlib for beginners. 
This course begins with the fundamentals of plotting before moving into customization.</p><ul><li><p><strong>Setup</strong>: Import the libraries: <code>matplotlib.pyplot as plt</code>, <code>numpy as np</code>, and <code>pandas as pd</code>.</p></li><li><p><strong>Basic Line Graph</strong>: Create a simple line graph with <code>plt.plot(x, y)</code> and display it using <code>plt.show()</code>.</p></li><li><p><strong>Labels and Titles</strong>: Add context with <code>plt.title()</code>, <code>plt.xlabel()</code>, and <code>plt.ylabel()</code>.</p></li><li><p><strong>Tick Marks</strong>: Adjust axis tick values with <code>plt.xticks()</code> and <code>plt.yticks()</code> for readability.</p></li><li><p><strong>Legends</strong>: Use <code>plt.legend()</code> to label multiple datasets; it will auto-place the legend.</p></li><li><p><strong>Line Styling</strong>: Control appearance with parameters such as <code>color</code>, <code>linewidth</code>, <code>marker</code>, <code>markersize</code>, and <code>linestyle</code>.</p></li><li><p><strong>Shorthand Notation</strong>: Combine style settings quickly, e.g. 
<code>'b.-'</code> for a blue solid line with point markers.</p></li><li><p><strong>Font Customization</strong>: Pass a <code>fontdict</code> to titles or labels to change size, family, and weight.</p></li><li><p><strong>Figure Size and Resolution</strong>: Define figure dimensions with <code>plt.figure(figsize=(w, h), dpi=300)</code>.</p></li><li><p><strong>Saving a Plot</strong>: Export graphics with <code>plt.savefig('plot.png', dpi=300)</code>.</p></li></ul><div><hr></div><h4>Variety of Plot Types</h4><p>The course covers multiple visualization methods beyond line graphs:</p><ul><li><p><strong>Line Graphs</strong> &#8212; e.g., gas price trends.</p></li><li><p><strong>Bar Charts</strong> &#8212; <code>plt.bar()</code>, with bar patterns via <code>set_hatch()</code>.</p></li><li><p><strong>Histograms</strong> &#8212; <code>plt.hist()</code> to show data distributions, with adjustable bins.</p></li><li><p><strong>Pie Charts</strong> &#8212; <code>plt.pie()</code>, including labels, percentages (<code>autopct</code>), custom colors, and slice separation (<code>explode</code>).</p></li><li><p><strong>Box-and-Whisker Plots</strong> &#8212; <code>plt.boxplot()</code> for comparing distributions, with options to style boxes.</p></li></ul><div><hr></div><h4>Real-World Application with Pandas</h4><p>Integration with Pandas is a key feature:</p><ul><li><p><strong>Loading Data</strong>: Read CSVs with <code>pd.read_csv()</code>.</p></li><li><p><strong>Example Datasets</strong>:</p><ul><li><p>Gas Prices &#8212; country-level prices over time.</p></li><li><p>FIFA Player Stats &#8212; player attributes from the FIFA game.</p></li></ul></li><li><p><strong>Data Selection</strong>: Extract subsets before plotting, such as <code>gas['USA']</code> or filtering with <code>FIFA.loc[FIFA['Club'] == 'FC Barcelona']</code>.</p></li></ul><div><hr></div><h4>Teaching Approach &amp; Takeaway</h4><p>The course keeps a practical 
focus:</p><ul><li><p><strong>Documentation First</strong>: Suggests using the official Matplotlib docs for functions and parameters.</p></li><li><p><strong>Problem Solving</strong>: Search Google or Stack Overflow for specific issues, as is common in real development; this can now be augmented with AI tools.</p></li></ul><p>This course covers the essentials of Matplotlib: how to create plots, customize them, work with multiple chart types, and apply them to real datasets through Pandas. It&#8217;s a practical introduction for anyone learning data visualization in Python. The datasets required to complete the Matplotlib course can be downloaded from <a href="https://github.com/KeithGalli/matplotlib_tutorial">GitHub</a>. You can also download <a href="https://matplotlib.org/cheatsheets/cheatsheets.pdf">cheatsheets</a> for reference from the official site.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Domestic Chips and Large Models Driving China’s AI Advancement]]></title><description><![CDATA[Signal Six - A curated summary of the latest in Data + AI, to keep you updated about the fast paced Technology Landscape.]]></description><link>https://on.valuecurve.ai/p/alibabas-qwen3-max-trillion-parameter</link><guid isPermaLink="false">https://on.valuecurve.ai/p/alibabas-qwen3-max-trillion-parameter</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Wed, 08 Oct 2025 03:31:34 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!gdHo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>China&#8217;s AI sector is rapidly advancing on two fronts: self-reliant hardware and globally competitive AI models. Huawei, Alibaba, Tencent, DeepSeek, Zhipu, and ByteDance announced breakthroughs across chips, large-scale LLMs, and multimodal systems&#8212;positioning the country&#8217;s ecosystem to close benchmark gaps with the U.S. while driving down costs of development and deployment.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><h4>1. Huawei: Ascend 910C Chip and Roadmap</h4><p>Huawei began shipping its <a href="https://www.huaweicentral.com/huawei-silently-testing-ascend-910c-ai-chip-to-rival-nvidia-report/">Ascend 910C chip</a> to major firms like Baidu and ByteDance for testing. Claimed to match Nvidia&#8217;s H100 in performance, this domestic GPU breakthrough supports self-reliant AI infrastructure, with initial deliveries enabling faster training of large models on Chinese hardware. 
Huawei also announced its 2025-2028 <a href="https://www.huawei.com/en/news/2025/9/hc-xu-keynote-speech">Ascend roadmap</a>, incorporating self-developed HBM memory for 2 PFLOPS per chip to ensure supply-chain independence, with clusters matching NVIDIA&#8217;s aggregate power despite single-chip gaps.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gdHo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gdHo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 424w, https://substackcdn.com/image/fetch/$s_!gdHo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 848w, https://substackcdn.com/image/fetch/$s_!gdHo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 1272w, https://substackcdn.com/image/fetch/$s_!gdHo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gdHo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png" width="1456" height="620" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:620,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!gdHo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 424w, https://substackcdn.com/image/fetch/$s_!gdHo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 848w, https://substackcdn.com/image/fetch/$s_!gdHo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 1272w, https://substackcdn.com/image/fetch/$s_!gdHo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1bdfdb48-ff4d-4cbf-a4aa-e97824c348d1_2350x1000.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h4>2. Alibaba: Qwen3-Max Model Leadership</h4><p>Building on this hardware foundation, Alibaba made headlines with its <a href="https://qwen.ai/blog?id=241398b9cd6353de490b0f82806c7848c5d2777d&amp;from=research.latest-advancements-list">Qwen3-Max</a>, a 1T+ parameter model that topped Hugging Face rankings. Backed by a &#165;380B AI infrastructure investment, it surpasses Llama 3.1 in math and coding tasks, supports 29+ languages, and offers a massive 2M-token context window. The Qwen3-Max-Instruct variant targets coding, instruction-following, and agent applications, underpinning rapid enterprise adoption and closing the U.S.-China benchmark gap.</p><div><hr></div><h4>3. Tencent: Hunyuan Image 3.0 Multimodal Model</h4><p>Meanwhile, Tencent contributed advancements on the multimodal front by releasing <a href="https://huggingface.co/tencent/HunyuanImage-3.0">Hunyuan Image 3.0,</a> an 80-billion-parameter Mixture-of-Experts model. 
Unlike standard diffusion models, it uses a unified autoregressive framework to tightly fuse text and image generation, excelling in photorealism and creative control while broadening accessibility for AI content creation across industries.</p><h4>4. DeepSeek: V3.2-Exp Efficiency Upgrade </h4><p>Efficiency and cost-effectiveness received attention as well, with DeepSeek unveiling its <a href="https://huggingface.co/deepseek-ai/DeepSeek-V3.2-Exp">V3.2-Exp</a> intermediate model. Building on prior versions, it introduces DeepSeek Sparse Attention to enhance long-context processing while halving API costs. Optimized for Chinese native chips and supporting CUDA cross-compatibility, it positions itself as a flexible, cost-competitive alternative in the LLM market.</p><div><hr></div><h4>5. Zhipu AI: GLM-4.6 and Claude Migration Plan</h4><p>Zhipu AI also advanced with <a href="https://docs.z.ai/guides/llm/glm-4.6">GLM-4.6</a>, enhancing long-context reasoning and agent workflows while supporting massive input and output token windows. Available via API and open weights for local deployment, Zhipu&#8217;s release comes with a migration plan targeting Anthropic Claude users&#8212;offering a lower-cost, higher-usage alternative designed to attract a large user base.</p><h4>6. ByteDance: Doubao 1.6-Vision Multimodal Launch</h4><p>Finally, ByteDance&#8217;s cloud and AI unit Volcengine launched <a href="https://www.volcengine.com/product/doubao">Doubao 1.6-Vision</a>, a multimodal model introducing tool-calling for complex visual tasks. 
It delivers advanced visual reasoning and image operation capabilities like cropping and annotation, while slashing deployment costs nearly in half compared to its predecessor, enabling broader affordability and scalability for visual AI services.</p><div><hr></div><p><em>Taken together, these announcements show a coordinated acceleration of China&#8217;s AI ecosystem&#8212;expanding from hardware foundations to frontier models and cost-efficient applications. The focus on domestic chip capability, long-context LLMs, multimodal systems, and accessible pricing signals more than just catch-up; it highlights a maturing ecosystem increasingly able to set competitive benchmarks on its own terms.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Object-Oriented Concepts in Python ]]></title><description><![CDATA[Representing real-world entities using classes and objects]]></description><link>https://on.valuecurve.ai/p/object-oriented-concepts-in-python</link><guid isPermaLink="false">https://on.valuecurve.ai/p/object-oriented-concepts-in-python</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Tue, 07 Oct 2025 12:29:22 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Object-Oriented 
Programming (OOP) in Python provides a structured way to represent real-world entities by combining <strong>data (attributes)</strong> and <strong>behavior (methods)</strong> in reusable components called classes. By designing and instantiating classes, you create objects that encapsulate both state and functionality, making code more modular and expressive.</em></p><div><hr></div><h4>Classes and Objects</h4><ul><li><p><strong>Object</strong>: Everything in Python is an object, and every object has a type that defines its data and behaviors.</p></li><li><p><strong>Class</strong>: A class acts as a blueprint for creating objects.</p></li><li><p><strong>Instance (Object)</strong>: A specific creation based on that class blueprint.</p></li></ul><p><strong>Defining a Class</strong></p><p>You define a class using the <code>class</code> keyword, followed by the class name. In Python 3, it&#8217;s common to write <code>class Coordinate:</code> instead of <code>class Coordinate(object):</code> because all classes inherit from <code>object</code> by default.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, 
https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="9000" height="4320" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:4320,&quot;width&quot;:9000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;a neon circle with a snake on it&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="a neon circle with a snake on it" title="a neon circle with a snake on it" 
srcset="https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1690683790356-c1edb75e3df7?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw3fHxweXRob24lMjBwcm9ncmFtbWluZ3xlbnwwfHx8fDE3NTk0MDgyOTh8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg 
xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Attributes</strong></p><ul><li><p><strong>Data attributes</strong> (instance variables): Values stored per object, typically defined in <code>__init__</code>.</p></li><li><p><strong>Methods</strong>: Functions defined inside a class that operate on its instance data.</p></li></ul><p><code>__init__</code><strong> Method and </strong><code>self</code></p><ul><li><p><strong>Constructor (</strong><code>__init__</code><strong>)</strong>: Called automatically when a new instance is created; initializes the object&#8217;s attributes.</p></li><li><p><code>self</code><strong> parameter</strong>: Refers to the instance on which a method is called. It is passed implicitly; <code>self</code> is the convention, not a Python keyword.</p></li></ul><p><strong>Data Attributes</strong></p><p>Defined inside <code>__init__</code> with <code>self</code> (e.g., <code>self.x = xval</code>). These persist for the object&#8217;s lifetime.</p><p><strong>Dunder Methods and Operator Overloading</strong></p><ul><li><p><strong>Special Methods (Dunder Methods)</strong>: Begin and end with double underscores, such as <code>__init__</code>, <code>__str__</code>, <code>__add__</code>.</p></li><li><p><code>__str__</code>: Defines a human-readable string (used in <code>print()</code> calls). 
Falls back to <code>__repr__</code> if not defined.</p></li><li><p><strong>Operator Overloading</strong>: By defining methods like <code>__add__</code>, <code>__eq__</code>, and <code>__mul__</code>, you control how operators work on your custom objects.</p></li></ul><p><strong>Inheritance</strong></p><ul><li><p><strong>Concept</strong>: A child (subclass) can inherit attributes and methods from a parent (superclass), e.g., <code>class Cat(Animal):</code>.</p></li><li><p><strong>Overriding</strong>: A child can replace a parent&#8217;s method with its own definition.</p></li><li><p><strong>Extending</strong>: A child can add new attributes/methods.</p></li><li><p><strong>Method Resolution Order (MRO)</strong>: Python looks for attributes in the current class, then parent(s), following a consistent MRO for multiple inheritance.</p></li></ul><p><strong>Class Variables</strong></p><ul><li><p>Variables defined inside the class but outside methods are shared by all instances.</p></li><li><p>Be careful: assigning to <code>self.var</code> on an instance creates an <strong>instance attribute</strong> that shadows the class variable.</p></li></ul><div><hr></div><h4><strong>Python Code Example: Defining a Coordinate Class</strong></h4><p>This code block demonstrates how to define a class, create a constructor (<code>__init__</code>), define data attributes, and implement an instance method (<code>distance</code>). It also shows the power of the <code>__str__</code> method.</p><pre><code>import math

class Coordinate:
    # Class Variables (shared by all instances)
    dimension = 2
    instance_count = 0

    def __init__(self, x_val, y_val):
        # Instance Variables
        self.x = x_val
        self.y = y_val
        Coordinate.instance_count += 1

    def distance(self, other_coord):
        """Calculates the Euclidean distance to another Coordinate."""
        x_diff = self.x - other_coord.x
        y_diff = self.y - other_coord.y
        return math.sqrt(x_diff**2 + y_diff**2)

    def __str__(self):
        """Defines a human-readable string representation."""
        return f"&lt;{self.x}, {self.y}&gt;"

    def __add__(self, other_coord):
        """Overloads the '+' operator to add coordinates (vector addition)."""
        new_x = self.x + other_coord.x
        new_y = self.y + other_coord.y
        return Coordinate(new_x, new_y)
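
    def __eq__(self, other_coord):
        # Illustrative addition: overloads '==' so two coordinates
        # compare equal by value rather than by object identity.
        return self.x == other_coord.x and self.y == other_coord.y

# Illustrative addition: inheritance. A subclass reuses, extends,
# and overrides behavior from its parent class.
class Coordinate3D(Coordinate):
    dimension = 3  # overrides the inherited class variable

    def __init__(self, x_val, y_val, z_val):
        super().__init__(x_val, y_val)  # reuse the parent constructor
        self.z = z_val                  # extend with a new attribute

    def __str__(self):
        # Overriding: the MRO finds this method before Coordinate's
        return f"&lt;{self.x}, {self.y}, {self.z}&gt;"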

# --- Creating Objects ---
point1 = Coordinate(3, 4)
point2 = Coordinate(0, 0)
point3 = Coordinate(10, 5)

# --- Accessing Attributes and Methods ---
print(f"Point 1 X-attribute: {point1.x}")
print(f"Distance between {point1} and {point2}: {point1.distance(point2):.2f}")
print(f"Point 3 representation: {point3}")

# --- Operator Overloading ---
point_sum = point1 + point3
print(f"Point Sum (point1 + point3): {point_sum}")

# --- Class Variables ---
print(f"Total Coordinate objects created: {Coordinate.instance_count}")
print(f"Dimension (class variable): {point1.dimension}")</code></pre><div><hr></div><h4>Summary and Best Practices</h4><p>OOP provides the tools to manage large, dynamic projects by modeling complexity with clarity, structure, and consistency. Mastering these fundamentals is the key to advancing beyond basic scripting into professional software development.</p><p>The central mechanisms are:</p><ul><li><p><strong>Instantiation</strong>: The <code>__init__</code> method and the <code>self</code> parameter construct new objects and bind instance-specific data.</p></li><li><p><strong>Specialization</strong>: Inheritance allows creation of new classes that extend or override the behavior of their parents, promoting code reuse and clarifying relationships.</p></li><li><p><strong>Integration</strong>: Dunder methods like <code>__str__</code> and <code>__add__</code> let custom objects integrate naturally with Python&#8217;s built-in functions and operators, producing more readable and &#8220;Pythonic&#8221; code.</p></li></ul><p><strong>Best practices include:</strong></p><ul><li><p>Designing classes around a clear, single responsibility to keep them maintainable and understandable.</p></li><li><p>Using <code>__str__</code> for human-readable output and <code>__repr__</code> for accurate, developer-facing representations.</p></li><li><p>Managing class variables carefully, recognizing when values should be shared at the class level versus kept per instance.</p></li><li><p>Favoring <strong>composition</strong> (building complex objects from simpler ones) when it improves clarity, rather than relying heavily on inheritance hierarchies.</p></li><li><p>Applying operator overloading selectively, only when it
genuinely improves code clarity and aligns with intuitive object behavior.</p></li></ul><p>OOP is powerful in Python but should be used thoughtfully, alongside procedural and functional styles, to build flexible, scalable, and maintainable systems.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Lists in Python are indispensable ]]></title><description><![CDATA[Python Basics : Data Structure]]></description><link>https://on.valuecurve.ai/p/lists-in-python</link><guid isPermaLink="false">https://on.valuecurve.ai/p/lists-in-python</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Mon, 06 Oct 2025 12:15:28 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!SQmT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Python lists are a compound data type, similar to tuples, that can be populated with objects of any type, including integers, strings, other lists, or a mix of different types. 
Lists are created using square brackets </em><code>[]</code><em>.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SQmT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SQmT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 424w, https://substackcdn.com/image/fetch/$s_!SQmT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 848w, https://substackcdn.com/image/fetch/$s_!SQmT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!SQmT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!SQmT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg" width="688" height="647.2296296296296" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1016,&quot;width&quot;:1080,&quot;resizeWidth&quot;:688,&quot;bytes&quot;:297973,&quot;alt&quot;:&quot;table&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="table" title="table" srcset="https://substackcdn.com/image/fetch/$s_!SQmT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 424w, https://substackcdn.com/image/fetch/$s_!SQmT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 848w, https://substackcdn.com/image/fetch/$s_!SQmT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!SQmT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3dd560b-740d-4f15-b439-0ef8e91d4bca_1080x1016.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 
pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@jontyson">Jon Tyson</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><div><hr></div><h4>Key Characteristics and Operations</h4><p><strong>Mutability</strong>: The most significant feature of a list is its mutability. Unlike immutable objects such as strings and tuples, lists can be changed in memory after they are created. This means you can alter a list object itself without creating a new copy. This is a key difference from tuples, where reassigning a variable to a new tuple creates a new object in memory, leaving the original unchanged. 
Because of their mutable nature, lists are highly efficient for managing dynamic data, like a list of employees or students, as changes don&#8217;t require creating entirely new copies of large data structures.</p><p><strong>Modifying Elements</strong>: You can change an element at a specific index using assignment syntax, such as <code>L[index] = 5</code>.</p><p><strong>Adding Elements</strong>:</p><ul><li><p><code>append()</code>: Adds a single item to the end of a list, mutating the original list. Returns <code>None</code>.</p></li><li><p><code>extend()</code>: Adds all elements from another list to the end of the original.</p></li></ul><p><strong>Removing Elements</strong>:</p><ul><li><p><code>remove(element)</code>: Removes the first occurrence of a specified value.</p></li><li><p><code>del L[index]</code>: Deletes the element at a specific index.</p></li><li><p><code>pop()</code>: Removes and returns the last element.</p></li><li><p><code>clear()</code>: Removes all elements, keeping the same list object in memory.</p></li></ul><p><strong>Sorting and Reversing</strong>:</p><ul><li><p><code>L.sort()</code>: Sorts the list in place, returning <code>None</code>.</p></li><li><p><code>L.reverse()</code>: Reverses the list in place.</p></li><li><p><code>sorted(L)</code>: Returns a new, sorted copy without altering the original.</p></li></ul><p><strong>Iteration</strong>:</p><ul><li><p>Iterate directly with <code>for e in L:</code> (more Pythonic).</p></li><li><p>Use <code>for i in range(len(L)):</code> if you need to mutate elements by index.</p></li><li><p>Mutating a list while iterating directly can cause skipped elements; safer: iterate over a copy.</p></li></ul><p><strong>Copying (Cloning)</strong>:</p><ul><li><p>Shallow copy: <code>L_copy = L[:]</code> (top-level only).</p></li><li><p>For independent nested structures, use <code>copy.deepcopy(L)</code>.</p></li></ul><p><strong>List Comprehension</strong>:</p><ul><li><p>Pattern: <code>[expression for item in 
sequence if condition]</code>.</p></li><li><p>A concise way to replace explicit loop + append structures.</p></li></ul><div><hr></div><h4>Summary and Best Practices</h4><p>In summary, the Python list is a versatile and indispensable collection. To use it effectively and avoid common pitfalls, keep these core principles in mind:</p><ul><li><p><strong>Embrace Mutability</strong>: Use in-place methods like <code>append()</code>, <code>extend()</code>, and <code>sort()</code> for efficient data alteration, but be mindful that these methods always mutate the original list.</p></li><li><p><strong>Beware of Aliasing</strong>: Use shallow copy techniques such as <code>L[:]</code>, <code>list(L)</code>, or <code>L.copy()</code> when you need an independent copy of a list&#8217;s top-level structure. This prevents unintentional modifications to the original list. If the list contains nested mutable structures (e.g., lists of lists), use <code>copy.deepcopy(L)</code> when you need a fully independent copy of both top-level and nested objects.</p></li><li><p><strong>Use List Comprehensions</strong>: Adopt the <code>[expression for item in sequence if condition]</code> pattern to write cleaner, faster, and more readable code when generating new lists.</p></li><li><p><strong>Prefer Safe Mutations During Iteration</strong>: Mutating a list while iterating can cause skipped elements. Iterate by index or over a copy for safety.</p></li><li><p><strong>Understand Sorting Options</strong>: Use <code>list.sort()</code> when you want to sort in place for performance.
Use <code>sorted(list)</code> when you need a new sorted copy without affecting the original.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share ValueCurve</span></a></p></li></ul>]]></content:encoded></item><item><title><![CDATA[Sequoia estimates a 10 Trillion AI Revolution ]]></title><description><![CDATA[How Sequoia Capital envisions the future of AI unfolding, and the investment opportunities they have identified.]]></description><link>https://on.valuecurve.ai/p/sequoias-estimates-a-10-trillion</link><guid isPermaLink="false">https://on.valuecurve.ai/p/sequoias-estimates-a-10-trillion</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 05 Oct 2025 03:31:30 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/203381e2-be2e-4a62-ad8a-0f32fa0fc846_940x288.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><a href="https://www.sequoiacap.com">Sequoia Capital</a>, a leading Silicon Valley venture capital firm known for investing in early and growth-stage technology companies, recently released a <a href="https://youtu.be/yoycgOMq1tI?si=Li-DOkIvC3b8AjZR">presentation</a> on AI, describing it as a $10 trillion &#8220;cognitive revolution.&#8221; Their core thesis positions this transformation as being as significant&#8212;if not more so&#8212;than the Industrial Revolution.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a 
class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AFYt!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AFYt!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 424w, https://substackcdn.com/image/fetch/$s_!AFYt!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 848w, https://substackcdn.com/image/fetch/$s_!AFYt!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 1272w, https://substackcdn.com/image/fetch/$s_!AFYt!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AFYt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic" width="940" height="288" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:288,&quot;width&quot;:940,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:12949,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/174907006?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AFYt!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 424w, https://substackcdn.com/image/fetch/$s_!AFYt!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 848w, https://substackcdn.com/image/fetch/$s_!AFYt!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 1272w, https://substackcdn.com/image/fetch/$s_!AFYt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6b8fb458-e904-42c2-b9ac-8e57b2272026_940x288.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><p><a href="https://www.sequoiacap.com/people/konstantine-buhler/">Konstantine Buhler</a> from Sequoia draws a compelling parallel between milestones of the industrial era&#8212;the steam engine, the first factory system, and the assembly line&#8212;and key developments in the AI era, such as the introduction of the first GPU (the GeForce 256 in 1999) and the establishment of the first &#8220;AI factory&#8221; in 2016. Just as it took 144 years to perfect the factory assembly line, AI is now entering a crucial phase of specialization.</p><p>For a complex system like AI to mature, it must integrate general-purpose components (like foundational AI models) with highly specialized subsystems and labor. 
Sequoia views today&#8217;s startups as the primary drivers of this specialization, building targeted applications atop general AI technologies.</p><div><hr></div><h4>How the AI Future will unfold</h4><p>Sequoia expects AI to catalyze a major economic shift by automating and expanding the market for knowledge work and services. They identify the $10 trillion US services market as the primary opportunity. Comparing it with the evolution of software-as-a-service (SaaS)&#8212;which expanded the on-premise software market&#8212;AI is anticipated not only to grow the market share but also to enlarge the services industry itself. This expansion could give rise to large, standalone public companies centered on AI, akin to the industrial giants of past eras.</p><p><strong>Investment Trends Sequoia is watching</strong></p><ol><li><p><strong>Leverage Over Uncertainty:</strong> Work is shifting from tasks with low leverage and high certainty to tasks where AI delivers massive leverage (100%+), albeit with less predictable outcomes. For example, a salesperson might deploy hundreds of AI agents to monitor accounts and intervene only as needed.</p></li><li><p><strong>Real-World Measurement:</strong> The benchmark for AI performance has moved beyond academic datasets like ImageNet. 
Sequoia points to real-world validation, such as AI hackers competing live on platforms like HackerOne, as a more meaningful measure.</p></li><li><p><strong>Reinforcement Learning:</strong> Previously limited to research labs, reinforcement learning is now employed by startups to train open-source models, especially in coding and software development.</p></li><li><p><strong>AI in the Physical World:</strong> AI is expanding beyond software, powering robotics, manufacturing processes, and quality assurance systems.</p></li><li><p><strong>Compute as the New Production Function:</strong> The emerging key metric is &#8220;flops per knowledge worker.&#8221; Sequoia&#8217;s portfolio companies forecast a 10x to 10,000x increase in compute consumption as workers begin leveraging hundreds or thousands of AI agents.</p></li></ol><div><hr></div><h4>Investment Themes for the next 12&#8211;18 Months</h4><ol><li><p><strong>Persistent Memory:</strong> A significant unsolved challenge in AI is long-term memory&#8212;both in retaining conversational context and preserving an agent&#8217;s identity. Current approaches, including vector databases and extended context windows, remain insufficient and need to be addressed.</p></li><li><p><strong>Seamless Communication Protocols:</strong> Just as TCP/IP enabled the internet, new protocols are needed for AI agents to communicate and collaborate effectively. This advancement would allow agents to perform complex tasks, such as researching, comparing prices, and completing purchases autonomously.</p></li><li><p><strong>AI Voice:</strong> Advances in fidelity and reduced latency now make AI voice suitable for real-time conversations.
Applications include consumer-facing virtual companions and enterprise solutions like logistics coordination and trading desks.</p></li><li><p><strong>AI Security:</strong> There is a substantial opportunity to embed security at every layer of the AI stack&#8212;from model development to end-user protection. Sequoia envisions a future with hundreds of AI security agents safeguarding each human and AI agent.</p></li><li><p><strong>Open Source:</strong> Despite its precarious position, Sequoia considers open-source AI essential for a free and equitable future. They aim to support models that remain accessible and competitive, counterbalancing proprietary dominance.</p></li></ol><p>Sequoia&#8217;s view is that AI is entering a phase of deep specialization, with startups leading the development of the next wave of infrastructure and applications. This moment represents an economically transformative era with long-term opportunities across services, compute, security, and open-source ecosystems.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Python Dictionaries are a Compound Data Type ]]></title><description><![CDATA[Mapping related data in a key value pair]]></description><link>https://on.valuecurve.ai/p/python-dictionaries-are-a-compound</link><guid isPermaLink="false">https://on.valuecurve.ai/p/python-dictionaries-are-a-compound</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Fri, 03 Oct 2025 11:36:00 GMT</pubDate><enclosure 
url="https://substackcdn.com/image/fetch/$s_!gIBP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Python dictionaries are a compound data type designed to store and manage related data by mapping a custom index, called a <strong>key</strong>, to a corresponding <strong>value</strong>. This structure is analogous to a book dictionary that maps a word to its definition.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p>The need for dictionaries arises from the limitations of using lists for complex data storage. Let&#8217;s consider this with a student grades example:</p><ul><li><p>Using parallel lists (one for names, one for grades) is cumbersome. To find a student&#8217;s grade, you first have to find their index in the <code>names</code> list and then use that same index to access the <code>grades</code> list. This process becomes increasingly messy as more data (like quiz or problem set scores) is added, requiring more parallel lists that must be kept in sync.</p></li><li><p>Using a &#8220;master list&#8221; of nested lists also proves to be complex and hard to read, requiring nested loops to search for specific information.</p></li></ul><p>Dictionaries solve this by providing a direct mapping from a key (e.g., a student&#8217;s name) to a value (e.g., their grade or a more complex data structure containing all their information). 
An entry in a dictionary is a <strong>key-value pair</strong>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!gIBP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!gIBP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 424w, https://substackcdn.com/image/fetch/$s_!gIBP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 848w, https://substackcdn.com/image/fetch/$s_!gIBP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 1272w, https://substackcdn.com/image/fetch/$s_!gIBP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!gIBP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic" width="1456" height="885" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:885,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1078580,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/175029068?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!gIBP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 424w, https://substackcdn.com/image/fetch/$s_!gIBP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 848w, https://substackcdn.com/image/fetch/$s_!gIBP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 1272w, https://substackcdn.com/image/fetch/$s_!gIBP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb31031e8-e0bd-4d64-8dcc-2bdd7fbbd8c7_4300x2614.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" 
width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h4>Creating, Accessing, and Modifying Dictionaries</h4><p><strong>Creation</strong></p><ul><li><p>Dictionaries are created using <strong>curly braces </strong><code>{}</code>. An empty dictionary is created with <code>d = {}</code>.</p></li><li><p>Entries are defined as <code>key: value</code>, and multiple entries are separated by commas. 
For example, a dictionary mapping names to grades could be created like this: <code>grades = {'Ana': 'B', 'Matt': 'A', 'John': 'B'}</code>.</p></li></ul><p><strong>Accessing and Mutability</strong></p><ul><li><p><strong>Dictionaries are mutable objects</strong>, meaning they can be changed after they are created.</p></li><li><p>To look up a value, you use the key as an index in <strong>square brackets</strong>, similar to list indexing (e.g., <code>grades['John']</code> would return <code>'B'</code>).</p></li><li><p>If you try to access a key that does not exist in the dictionary, Python will raise a <code>KeyError</code> exception.</p></li><li><p>You cannot look up a key by its value directly, as values can be duplicated. You would need to write a loop to perform this reverse lookup.</p></li></ul><p><strong>Operations</strong></p><ul><li><p><strong>Adding or Modifying Entries</strong>: The same syntax is used for both adding a new entry and changing an existing one. If the key does not exist, a new entry is created. If it does exist, its value is overwritten. For example, <code>grades['Grace'] = 'A'</code> adds Grace, and <code>grades['Grace'] = 'C'</code> later changes her grade.</p></li><li><p><strong>Deleting Entries</strong>: The <code>del</code> keyword is used to remove an entry, such as <code>del grades['Ana']</code>. This mutates the original dictionary.</p></li><li><p><strong>Checking for a Key</strong>: The <code>in</code> operator checks for the existence of a <strong>key</strong> in a dictionary (e.g., <code>'John' in grades</code> returns <code>True</code>). 
It is important to note that the <code>in</code> operator <strong>only checks keys, not values</strong>.</p></li><li><p><strong>Copying</strong>: Since dictionaries are mutable, simple assignment (<code>d2 = d1</code>) creates an alias (another name for the same object). To create a shallow copy, you must use the <code>.copy()</code> method (<code>d2 = d1.copy()</code>).</p><div><hr></div></li></ul><h4>Iteration</h4><p>While recent versions of Python guarantee insertion order for dictionaries, it is best practice to write code assuming they are <strong>unordered collections</strong> for robustness and compatibility with older versions. There are three main ways to iterate over a dictionary:</p><ul><li><p><code>.keys()</code>: Returns an iterable sequence of all the keys in the dictionary.</p></li><li><p><code>.values()</code>: Returns an iterable sequence of all the values.</p></li><li><p><code>.items()</code>: This is often the most effective method, as it returns an iterable of key-value pairs as tuples. This allows you to access both the key and value simultaneously in a loop, for example: <code>for k, v in my_dict.items():</code>.</p></li></ul><p><strong>Restrictions on Keys and Values</strong></p><ul><li><p><strong>Dictionary values</strong> can be of <strong>any type</strong>, including other mutable objects like lists or other dictionaries. Values can also be duplicated.</p></li><li><p><strong>Dictionary keys</strong> have two main restrictions:</p><ol><li><p>They must be <strong>unique</strong>. You cannot have two entries with the same key.</p></li><li><p>They must be <strong>immutable</strong> (technically, &#8220;hashable&#8221;). 
This means keys can be types like integers, floats, strings, tuples, and booleans, but <strong>cannot be lists or other dictionaries</strong>.</p></li></ol></li></ul><h4>Performance and Implementation: Hashing</h4><p>Following is a detailed explanation of why dictionaries offer superior performance for lookups compared to lists and why keys must be immutable:</p><ul><li><p><strong>Average-case constant-time lookup</strong>: Dictionaries are implemented using a <strong>hash table</strong>. A <strong>hash function</strong> is run on a key, which converts it into an integer. This integer is then used as a direct index to find the location of the corresponding value in the hash table (which is structured like a list in memory). Because this lookup is based on a direct calculation rather than a sequential search, accessing an item in a dictionary takes <strong>constant time (&#920;</strong><code>(1)</code><strong>) on average</strong>. This provides a massive performance advantage over searching a list, which takes linear time (<strong>&#920;</strong><code>(n)</code>).</p></li><li><p><strong>Collisions and worst-case performance</strong>: A &#8220;collision&#8221; occurs when two different keys produce the same hash value, mapping them to the same location or &#8220;bucket&#8221; in the hash table. These collisions are handled by storing the colliding entries together, often in a list-like structure within that bucket. A good hash function distributes keys uniformly, minimizing collisions. However, in the <strong>worst-case scenario</strong>, if all keys collide into the same bucket, looking up an item degrades to a linear search through that bucket, resulting in <strong>linear time (&#920;</strong><code>(n)</code><strong>) performance</strong>.</p></li><li><p><strong>Why keys must be immutable</strong>: The hashing mechanism requires that a key always produces the same hash value. 
If a key were a mutable object like a list, its contents could change after it was stored in the dictionary. This change would alter its hash value, making it impossible for the dictionary to find the original location where the value was stored.</p></li></ul><h4>Use Cases</h4><ol><li><p><strong>Storing Structured Data</strong>: Dictionaries are excellent for representing complex, structured data. A nested dictionary (<code>grades['Ana']['ps']</code>) provides a much cleaner and more efficient way to store and retrieve student information than nested lists.</p></li><li><p><strong>Counting Frequencies</strong>: A common pattern is to create a &#8220;frequency dictionary&#8221; to count occurrences of items. An example shows how to build a dictionary that maps each word in a song&#8217;s lyrics to the number of times it appears.</p></li><li><p><strong>Memoization</strong>: Dictionaries are crucial for an optimization technique called memoization, which is used to speed up expensive computations, particularly in recursive functions. The Fibonacci example demonstrates this by storing previously calculated <code>fib(n)</code> values in a dictionary. Before computing a value, the function checks if the result is already in the dictionary, avoiding redundant calculations. This improves the Fibonacci function from exponential time to linear time, reducing millions of function calls to just a few dozen.</p></li></ol><div><hr></div><p><strong>Summary and Best Practices</strong></p><p>Python dictionaries are the core tool for managing unordered, dynamic data that requires fast, key-based lookup. 
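The operations and the memoization pattern described above can be sketched in one short, self-contained example (the grade values mirror the article's running example; the cache name is illustrative):

```python
# Core dictionary operations from the sections above.
grades = {'Ana': 'B', 'Matt': 'A', 'John': 'B'}   # creation
print(grades['John'])              # lookup by key -> 'B'
grades['Grace'] = 'A'              # add a new entry
grades['Grace'] = 'C'              # same syntax overwrites an existing entry
del grades['Ana']                  # delete an entry (mutates in place)
print('John' in grades)            # 'in' checks keys only -> True
backup = grades.copy()             # shallow copy, not an alias

for name, grade in grades.items(): # iterate over key-value pairs
    print(name, grade)

# Memoization: cache previously computed Fibonacci values in a dict.
fib_cache = {}

def fib(n):
    if n not in fib_cache:                # check the cache before computing
        fib_cache[n] = n if n < 2 else fib(n - 1) + fib(n - 2)
    return fib_cache[n]

print(fib(30))  # 832040, each fib(k) computed only once
```

Without the cache, `fib(30)` would trigger well over a million recursive calls; with it, each value is computed exactly once.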
They fundamentally map unique, hashable <strong>keys</strong> to corresponding <strong>values</strong>.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tHDP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!tHDP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 424w, https://substackcdn.com/image/fetch/$s_!tHDP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 848w, https://substackcdn.com/image/fetch/$s_!tHDP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 1272w, https://substackcdn.com/image/fetch/$s_!tHDP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!tHDP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png" width="1192" height="1008" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1008,&quot;width&quot;:1192,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:260619,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/175029068?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!tHDP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 424w, https://substackcdn.com/image/fetch/$s_!tHDP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 848w, https://substackcdn.com/image/fetch/$s_!tHDP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 1272w, https://substackcdn.com/image/fetch/$s_!tHDP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb4eb4ea6-b1eb-4d68-99da-837746255e3f_1192x1008.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The dictionary&#8217;s defining characteristic is its reliance on <strong>hash tables</strong>, which allows for item retrieval in <strong>average-case constant time (&#920;(1))</strong>, offering a massive performance advantage over sequential list searching.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Fundamental Programming Concepts in 
Python]]></title><description><![CDATA[Python is an imperative, object-oriented language where everything is an object]]></description><link>https://on.valuecurve.ai/p/fundamental-programming-concepts</link><guid isPermaLink="false">https://on.valuecurve.ai/p/fundamental-programming-concepts</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Thu, 02 Oct 2025 14:55:59 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Programming in Python involves providing the computer with a set of instructions, much like a recipe, to perform a task. This is known as imperative programming. The computer itself is not intelligent; it only follows the instructions you provide. These sequences of steps are called algorithms, which have a defined flow of control and a stopping condition.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="14467" height="9744" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:9744,&quot;width&quot;:14467,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;a white cube with a yellow and blue logo on 
it&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="a white cube with a yellow and blue logo on it" title="a white cube with a yellow and blue logo on it" srcset="https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1649180556628-9ba704115795?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxweXRob258ZW58MHx8fHwxNzU5NDA4MjU3fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 
12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@rubaitulazad">Rubaitul Azad</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>Learning to program is a skill that requires consistent practice. It is not just about understanding concepts but also about developing problem-solving ability&#8212;the process of translating a problem described in English into a computational solution. The following concepts serve as a <em><strong>memory aid</strong></em> for beginners in Python.</p><h4>Objects, Types, and Variables</h4><ul><li><p><strong>Objects and Types</strong>: In Python, everything is an object, from numbers to functions. Each object has a <em>type</em> that defines what operations can be performed on it. You can use <code>type()</code> to check an object&#8217;s type.</p></li><li><p><strong>Scalar Data Types</strong>:</p><ul><li><p><code>int</code>: Whole numbers like <code>5</code>, <code>0</code>, <code>-100</code>.</p></li><li><p><code>float</code>: Real numbers with a decimal point, such as <code>3.27</code> or <code>2.0</code>. 
Because of binary representation, floating-point operations may introduce tiny rounding errors.</p></li><li><p><code>bool</code>: Truth values&#8212;<code>True</code> or <code>False</code> (case-sensitive).</p></li><li><p><code>NoneType</code>: Has only one value, <code>None</code>, used to represent the absence of a value.</p></li></ul></li><li><p><strong>Variables and Expressions</strong>:</p><ul><li><p>Variables are names bound to objects. The assignment operator <code>=</code> binds the name on the left to the evaluated value on the right.</p></li><li><p>Example: <code>area = pi * radius**2</code> first evaluates the expression and then assigns it to <code>area</code>.</p></li><li><p>Variables can be rebound to new values. For instance, <code>x = x + 1</code> calculates <code>x + 1</code> and rebinds <code>x</code> to this result.</p></li></ul></li><li><p><strong>Operators</strong>:</p><ul><li><p>Arithmetic: <code>+</code>, <code>-</code>, <code>*</code>, <code>**</code>, <code>/</code> (float division), <code>//</code> (integer division), <code>%</code> (modulo).</p></li><li><p>Comparison: <code>==</code>, <code>!=</code>, <code>&gt;</code>, <code>&lt;</code> evaluate to <code>True</code> or <code>False</code>.</p></li><li><p>Logical: <code>and</code>, <code>or</code>, <code>not</code>.</p></li></ul></li></ul><div><hr></div><h4>Input, Output, and Control Flow</h4><ul><li><p><strong>Input and Output</strong>:</p><ul><li><p><code>print()</code> displays output.</p></li><li><p><code>input()</code> reads user input as a string. Casting is often required, e.g., <code>age = int(input("Enter age: "))</code>.</p></li><li><p><em>f-strings</em> provide formatted string interpolation: <code>print(f"Area = {area}")</code>.</p></li></ul></li><li><p><strong>Control Flow</strong>:</p><ul><li><p><strong>Branching</strong>: <code>if</code>, <code>elif</code>, <code>else</code> allow decision-making. 
Indentation defines code blocks.</p></li><li><p><strong>Loops</strong>:</p><ul><li><p><code>while</code>: Repeats while a condition is <code>True</code>. Be careful to avoid infinite loops.</p></li><li><p><code>for</code>: Iterates over sequences. Example: <code>for i in range(5):</code>.</p></li></ul></li></ul></li></ul><h4>Functions</h4><ul><li><p><strong>Defining and Calling</strong>: Declared with <code>def</code> and executed when called.</p></li><li><p><strong>Parameters and Return Values</strong>: Accept inputs and return results with <code>return</code>. Functions without a return default to <code>None</code>.</p></li><li><p><strong>First-Class Objects</strong>: Functions can be assigned to variables, passed as arguments, or returned from other functions.</p></li><li><p><strong>Scope</strong>: Each function has its own local scope.</p><ul><li><p>Local variables are destroyed after the function finishes.</p></li><li><p>Functions can read variables from outer scopes but cannot modify them unless declared with <code>global</code> or <code>nonlocal</code>.</p></li></ul></li><li><p><strong>Loop Control</strong>:</p><ul><li><p><code>break</code>: Exit the loop early.</p></li><li><p><code>continue</code>: Skip to the next iteration.</p></li><li><p><code>pass</code>: Placeholder that does nothing (useful for code stubs).</p></li></ul></li></ul><div><hr></div><h4>Common Data Structures</h4><ul><li><p><strong>Strings (</strong><code>str</code><strong>)</strong>: Immutable, ordered sequence of characters.</p><ul><li><p>Created with quotes: <code>"hello"</code>.</p></li><li><p>Operations: concatenation (<code>+</code>), repetition (<code>*</code>), length (<code>len()</code>), indexing, slicing (<code>s[1:3]</code>).</p></li></ul></li><li><p><strong>Tuples (</strong><code>tuple</code><strong>)</strong>: Immutable, ordered collection of objects.</p><ul><li><p>Example: <code>(2, 'MIT', 3)</code>.</p></li><li><p>Support indexing, slicing, and 
unpacking (<code>x, y = y, x</code>).</p></li><li><p>Often used to return multiple values from a function.</p></li></ul></li><li><p><strong>Lists (</strong><code>list</code><strong>)</strong>: Mutable, ordered collection.</p><ul><li><p>Example: <code>[2, 'a', 4]</code>.</p></li><li><p>Elements can be modified: <code>L[0] = 5</code>.</p></li><li><p>Common methods:</p><ul><li><p><code>L.append(item)</code> &#8211; add item at end</p></li><li><p><code>L.extend(other)</code> &#8211; append all items from another list</p></li><li><p><code>L.sort()</code> &#8211; sort list in place</p></li><li><p><code>L.reverse()</code> &#8211; reverse in place</p></li><li><p><code>L.remove(item)</code> &#8211; remove first matching element</p></li></ul></li><li><p>Assigning <code>newList = oldList</code> creates an alias, not a copy. For a copy: <code>newList = oldList[:]</code>.</p></li></ul></li><li><p><strong>Dictionaries (</strong><code>dict</code><strong>)</strong>: Mutable collection of key-value pairs.</p><ul><li><p>Example: <code>grades = {'Ana': 'B', 'John': 'A'}</code>.</p></li><li><p>Keys must be immutable types (str, int, tuple).</p></li><li><p>Access: <code>grades['Ana']</code>.</p></li><li><p>Add or modify: <code>grades['Grace'] = 'A'</code>.</p></li><li><p>Iteration: <code>.keys()</code>, <code>.values()</code>, <code>.items()</code>.</p></li><li><p>Safe access: <code>grades.get('Ana', 'Not Found')</code>.</p></li></ul></li></ul><p><strong>Exception and Error Handling</strong></p><p>Python provides robust mechanisms to handle errors using <code>try</code>, <code>except</code>, and <code>finally</code> blocks. 
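A minimal sketch of these mechanisms (the function names here are illustrative, not from any library):

```python
def parse_age(text):
    """Convert user input to an int, returning None on bad input."""
    try:
        return int(text)           # int() raises ValueError on bad input
    except ValueError:
        return None                # handle the error instead of crashing

def read_first_line(path):
    """The finally clause runs whether or not an exception occurred."""
    f = open(path)
    try:
        return f.readline()
    finally:
        f.close()                  # always executed, even if readline fails

print(parse_age("42"))     # 42
print(parse_age("forty"))  # None
```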
While not essential at the very beginning, learning exceptions is important once you start writing programs that deal with unpredictable input, files, or external data.</p><div><hr></div><h4>Summary &amp; Best Practices</h4><p>Python is an imperative, object-oriented language in which every entity is treated as an object. Its core components include variables, operators, control flow mechanisms, data structures, and functions. Mutability plays a key role: lists and dictionaries are mutable, while strings and tuples are immutable. Functions support abstraction, modularity, and reuse, with built-in features like default and keyword arguments.</p><p>Clean Python code is easier to read and maintain when variable names clearly reflect their purpose. <code>for</code> loops are typically more readable than <code>while</code> loops unless the loop depends on a specific condition, and f-strings simplify string formatting by embedding expressions directly. Mutable objects like lists and dictionaries can lead to unintended side effects when shared across functions or scopes, so they require careful handling. </p><p>Deeply nested logic can obscure intent; breaking it into smaller functions improves clarity and reuse. User input should be converted and validated to prevent errors. Comments are useful when necessary, but well-structured code and meaningful names often make them redundant. 
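The aliasing pitfall mentioned above can be seen in a few lines (the variable names are illustrative):

```python
# Assignment shares one object; slicing (or .copy()) makes a new one.
original = [1, 2, 3]
alias = original           # second name bound to the same list object
alias.append(4)
print(original)            # [1, 2, 3, 4] -- mutated through the alias

independent = original[:]  # slice creates a shallow copy
independent.append(5)
print(original)            # still [1, 2, 3, 4]
print(independent)         # [1, 2, 3, 4, 5]
```

The same sharing behavior applies when a mutable object is passed into a function: the function receives the object itself, not a copy.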
By combining a solid grasp of the language&#8217;s core constructs with disciplined coding practices, developers can build systems that are both elegant and resilient.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Claude introduces Sonnet 4.5 ]]></title><description><![CDATA[Gemini Robotics, Google Looker Studio and OpenAI Parental Control]]></description><link>https://on.valuecurve.ai/p/claude-introduces-sonnet-45</link><guid isPermaLink="false">https://on.valuecurve.ai/p/claude-introduces-sonnet-45</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Wed, 01 Oct 2025 03:34:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!rSGr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em>Six Pieces - A curated summary of the latest in Data + AI, to keep you updated on the fast-paced technology landscape.</em></p><div><hr></div><p><strong>[1] Anthropic</strong> has released Claude Sonnet 4.5, its <a href="https://www.anthropic.com/news/claude-sonnet-4-5">most capable model</a> to date, designed for complex agentic workflows and high-stakes coding tasks. 
The model supports a 200K token context window by default, with access to a 1 million token context in beta, enabling long-horizon tasks and large document processing.</p><p>According to Anthropic, Sonnet 4.5 demonstrates stronger performance in reasoning, code generation, and multimodal understanding. It offers a balance of intelligence and speed suitable for enterprise workloads and real-time AI experiences, and its alignment has been significantly improved. The model now shows reduced tendencies toward sycophancy, deception, and power-seeking, and is more robust against prompt injection attacks. Sonnet 4.5 is available via the Claude API, Amazon Bedrock, and Google Cloud&#8217;s Vertex AI, with pricing unchanged from Sonnet 4.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!rSGr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!rSGr!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 424w, https://substackcdn.com/image/fetch/$s_!rSGr!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 848w, https://substackcdn.com/image/fetch/$s_!rSGr!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 1272w, 
https://substackcdn.com/image/fetch/$s_!rSGr!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!rSGr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png" width="1374" height="1362" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1362,&quot;width&quot;:1374,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:162275,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/174892468?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!rSGr!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 424w, https://substackcdn.com/image/fetch/$s_!rSGr!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 848w, 
https://substackcdn.com/image/fetch/$s_!rSGr!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 1272w, https://substackcdn.com/image/fetch/$s_!rSGr!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19367f43-859c-4e2f-b890-b6dafd55735f_1374x1362.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>[2] OpenAI </strong>has introduced <a href="https://openai.com/index/introducing-parental-controls/">parental 
controls</a> for ChatGPT to help families manage teen usage in a safer, age-appropriate way. Parents can link their accounts with their teen&#8217;s, customize settings, and apply safeguards such as reduced exposure to graphic content, sexual or violent roleplay, and viral challenges. Additional controls include quiet hours, disabling voice mode, memory, image generation, and opting out of model training. A notification system alerts parents if ChatGPT detects signs of potential self-harm. As per OpenAI, these features were developed in consultation with experts and advocacy groups, and are part of OpenAI&#8217;s broader effort to build toward an age prediction system that automatically applies teen-appropriate settings.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><strong>[3] OpenAI</strong> has launched &#8220;<em><a href="https://openai.com/index/buy-it-in-chatgpt/">Buy it in ChatGPT</a></em>,&#8221; starting with <em>Instant Checkout</em>, a feature that lets users purchase products directly within ChatGPT. Initially available for U.S. <strong>Etsy</strong> sellers and expanding soon to over a million Shopify merchants like Glossier, SKIMS, and Vuori, it supports single-item purchases, with multi-item carts coming later. The feature is powered by the <a href="https://developers.openai.com/commerce">Agentic Commerce Protocol (ACP)</a>, an open standard co-developed with Stripe. 
ACP enables secure, real-time communication between AI agents (like ChatGPT), buyers, and merchants, allowing ChatGPT to act as a digital shopper that facilitates transactions without leaving the chat.</p><p><strong>[4]</strong><em> </em><a href="https://deepmind.google/models/gemini-robotics/gemini-robotics-er/">Gemini Robotics-ER 1.5</a><em> </em>is <strong><a href="https://deepmind.google/discover/blog/gemini-robotics-15-brings-ai-agents-into-the-physical-world/">Google DeepMind</a></strong>&#8217;s latest embodied reasoning model, designed to serve as a high-level cognitive engine for physical agents. It integrates spatial and temporal reasoning, task planning, and progress estimation, enabling robots to interpret scenes, plan multi-step actions, and execute tasks using external tools like <em>Google Search </em>or custom APIs. The model introduces a tunable &#8220;thinking budget&#8221; to balance latency and accuracy, supports semantically grounded 2D point generation, and demonstrates improved safety by recognizing physical constraints and refusing unsafe plans. <a href="http://deepmind.google/models/gemini-robotics/gemini-robotics/">Gemini Robotics 1.5</a> refers to the broader system that combines ER 1.5 with vision-language-action (VLA) models and cross-embodiment learning to support end-to-end robotic control. Gemini Robotics-ER 1.5 is available in preview via the Gemini API in Google AI Studio, while Gemini Robotics 1.5 is currently accessible only to select partners.</p><div><hr></div><p><strong>[5] </strong><a href="https://cloud.google.com/looker-studio">Looker Studio</a> is <strong>Google Cloud</strong>&#8217;s self-service business intelligence platform for creating customizable dashboards and reports. It supports over 800 data connectors and includes features like drag-and-drop editing, prebuilt templates, report embedding, and an API for asset management. 
Recent updates include the launch of <em>Looker Studio Pro</em>, which adds enterprise capabilities such as team workspaces, <strong>Google Cloud</strong> project linking, and admin support for governance and access control. Pricing for <em>Looker Studio Pro</em> is <a href="https://cloud.google.com/looker-studio">$9 per user per project per month</a>, while the standard version remains free for creators and viewers.</p><p><strong>[6] Tilde</strong> (an EU / Baltic / Nordic AI startup) released <strong><a href="https://tilde.ai/news/tilde-releases-tildeopen-llm/">TildeOpen LLM</a></strong>, a 30-billion-parameter open model optimized for European languages. TildeOpen was trained on the EuroHPC LUMI supercomputer using AMD Instinct&#8482; MI250X accelerators, supports all 24 official EU languages plus others, and is released under the permissive CC-BY-4.0 license. This bolsters Europe&#8217;s multilingual AI infrastructure, improving coverage in lesser-served languages.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Tuples are Immutable]]></title><description><![CDATA[Python Basics: Data Structures]]></description><link>https://on.valuecurve.ai/p/tuples-are-immutable</link><guid isPermaLink="false">https://on.valuecurve.ai/p/tuples-are-immutable</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Mon, 29 Sep 2025 11:26:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!nzu2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic" length="0" 
type="image/jpeg"/><content:encoded><![CDATA[<p><em>In Python, one of the foundational building blocks of data management is the <strong>tuple</strong>. While lists often get more attention for their flexibility, tuples play an equally important&#8212;sometimes even more efficient&#8212;role in everyday programming. The key feature that makes tuples unique is immutability. Once created, a tuple cannot be changed, offering reliability and consistency when working with data.</em></p><div><hr></div><h4><strong>Tuples as Data Structures</strong></h4><p>Tuples are a fundamental Python data structure&#8212;a way of organizing data with a focus on immutability and reliability. Unlike lists, tuples guarantee their contents won&#8217;t change, which makes them suitable in contexts such as dictionary keys, fixed records, or multiple return values.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!nzu2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!nzu2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 424w, https://substackcdn.com/image/fetch/$s_!nzu2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 848w, https://substackcdn.com/image/fetch/$s_!nzu2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!nzu2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!nzu2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic" width="941" height="705" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:705,&quot;width&quot;:941,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:35523,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/174821330?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!nzu2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 424w, https://substackcdn.com/image/fetch/$s_!nzu2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 848w, https://substackcdn.com/image/fetch/$s_!nzu2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 
1272w, https://substackcdn.com/image/fetch/$s_!nzu2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fca2c180b-5832-41ae-a62c-b4ac686d5426_941x705.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/p/tuples-are-immutable?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" 
href="https://on.valuecurve.ai/p/tuples-are-immutable?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><h4>Tuples: Immutable Sequences</h4><p>A <em>tuple</em> is an ordered collection of values, typically enclosed in parentheses `()`. Unlike lists, tuples cannot be modified once defined.</p><pre><code>coordinates = (10, 20)
# Trying to change an element raises an error:
# coordinates[0] = 99   # TypeError: 'tuple' object does not support item assignment</code></pre><p>This immutability makes tuples ideal for storing fixed collections of related data where consistency matters.</p><p><strong>Packing: Collecting Values into a Tuple</strong></p><p>Packing is a feature where multiple values are grouped together into a tuple automatically.</p><pre><code>point = 5, 7
print(point)   # Output: (5, 7)</code></pre><p>This allows for easy grouping of values without explicit tuple syntax.</p><p><strong>Unpacking: Assigning Tuple Elements</strong></p><p>Unpacking takes a tuple and assigns its elements to individual variables in one step.</p><pre><code>x, y = point
print(x, y)   # Output: 5 7</code></pre><p>Unpacking is useful when functions return multiple values or when processing tuple-based data.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><p><strong>`zip`: Combining Iterables</strong></p><p>The `zip()` function pairs elements from multiple iterables into tuples, providing a combined view of the data.</p><pre><code>names = ["Alice", "Bob", "Charlie"]
scores = [85, 90, 95]

zipped = zip(names, scores)
print(list(zipped))  
# Output: [('Alice', 85), ('Bob', 90), ('Charlie', 95)]</code></pre><p>This is helpful for parallel iteration over multiple sequences.</p><p><strong>`enumerate`: Indexed Iteration</strong></p><p>When you want both the index and the value from a sequence, `enumerate()` produces tuples of `(index, value)` pairs.</p><pre><code>fruits = ["apple", "banana", "cherry"]

for index, fruit in enumerate(fruits, start=1):
    print(index, fruit)

# Output:
# 1 apple
# 2 banana
# 3 cherry</code></pre><p><strong>Sorting with a Key</strong></p><p>Tuples can represent structured data, such as `(name, score)` pairs. When sorting collections of tuples, a sort key can determine the criterion.</p><pre><code>students = [("Alice", 85), ("Bob", 90), ("Charlie", 78)]

sorted_students = sorted(students, key=lambda student: student[1])
print(sorted_students)
# Output: [('Charlie', 78), ('Alice', 85), ('Bob', 90)]</code></pre><p>This sorts the list based on the score (the second tuple element).</p><p>Combined with Python&#8217;s powerful features like packing, unpacking, `zip`, and `enumerate`, tuples provide a simple but robust way to handle fixed collections of data efficiently and safely.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/subscribe?utm_source=menu&amp;simple=true&amp;next=https%3A%2F%2Fwww.valuecurve.co%2Fp%2Ftuples-are-immutable&quot;,&quot;text&quot;:&quot;Stay Connected&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/subscribe?utm_source=menu&amp;simple=true&amp;next=https%3A%2F%2Fwww.valuecurve.co%2Fp%2Ftuples-are-immutable"><span>Stay Connected</span></a></p>]]></content:encoded></item><item><title><![CDATA[Solving Anagrams beyond Wordplay]]></title><description><![CDATA[Practice assignment for competitive programming]]></description><link>https://on.valuecurve.ai/p/solving-anagrams-beyond-wordplay</link><guid isPermaLink="false">https://on.valuecurve.ai/p/solving-anagrams-beyond-wordplay</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 14 Sep 2025 02:31:27 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!p_O7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><strong>An anagram is a word or phrase formed by rearranging the letters of another, using every letter exactly once</strong>. For example, cinema can be rearranged to iceman&#8212;both contain the same letters in identical amounts, just ordered differently. 
Anagrams are more than clever wordplay; they&#8217;re foundational in puzzles, word games, and practical computing problems. Recognizing anagrams is useful in fields like text analysis, cryptography, and natural language processing, where lexical structure and variation matter.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!p_O7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!p_O7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 424w, https://substackcdn.com/image/fetch/$s_!p_O7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 848w, https://substackcdn.com/image/fetch/$s_!p_O7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 1272w, https://substackcdn.com/image/fetch/$s_!p_O7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!p_O7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1182193,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/173099879?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!p_O7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 424w, https://substackcdn.com/image/fetch/$s_!p_O7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 848w, https://substackcdn.com/image/fetch/$s_!p_O7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 1272w, https://substackcdn.com/image/fetch/$s_!p_O7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe3e7647a-1a9a-46e3-b8a9-3a30800b8175_4608x3456.heic 
1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h4><strong>The Challenge</strong></h4><p>The task is to write a program that, given two lowercase phrases (with spaces and punctuation removed), determines whether they are anagrams. For instance, <em>listen</em> and <em>silent</em> are anagrams, while <em>hello</em> and <em>world</em> are not. 
Though often used as a beginner coding exercise, this challenge has real-world relevance in systems that require fast, reliable string comparison&#8212;such as search engines, data deduplication, and secure identifier matching.</p><p><strong>Counting Characters</strong></p><p>Two strings are anagrams if and only if they contain the same characters with the same frequency. For example, both <em>listen</em> and <em>silent</em> contain one <em>l</em>, one <em>i</em>, one <em>s</em>, one <em>t</em>, one <em>e</em>, and one <em>n</em>. If the character counts match for all letters, the strings are anagrams. This principle forms the basis of the program&#8217;s logic: rather than comparing order, it compares composition.</p><p><strong>Algorithm in Three Steps</strong></p><ol><li><p><strong>Length Check</strong>: Return false immediately if the lengths differ, as they cannot be anagrams.</p></li><li><p><strong>Frequency Counting</strong>: Iterate over both strings, counting characters using dictionaries.</p></li><li><p><strong>Comparison</strong>: Compare the two dictionaries; if they match, print "true", else "false".</p></li></ol><p>This approach ensures efficiency and clarity, especially when scaled to multiple test cases.</p><div><hr></div><h4><strong>Python Code</strong></h4><pre><code>def solve_anagrams():
    try:
        print("Anagram Verifier")
        print("----------------")
        print("Enter a number between 1 and 20 for anagram pairs to check")
        t = int(input())  # Number of anagram pairs to check 
        if not (1 &lt;= t &lt;= 20):
            return

        for _ in range(t):
            line = input().split()
            s1, s2 = line[0], line[1]

            # Step 1: Length check
            if len(s1) != len(s2):
                print("false")
                continue

            # Step 2: Build character frequency dictionaries
            count1, count2 = {}, {}
            for char in s1:
                count1[char] = count1.get(char, 0) + 1
            for char in s2:
                count2[char] = count2.get(char, 0) + 1

            # Step 3: Compare dictionaries
            print("true" if count1 == count2 else "false")
    except (IOError, ValueError):
        return
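
# For comparison: the manual frequency dictionaries above can be replaced
# by collections.Counter, which builds the same character-frequency maps
# in one call. (A sketch; is_anagram is an illustrative helper, not part
# of the original program.)
from collections import Counter

def is_anagram(a, b):
    # Two strings are anagrams iff their character counts match
    return Counter(a) == Counter(b)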

solve_anagrams()
</code></pre><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!8qGP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!8qGP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 424w, https://substackcdn.com/image/fetch/$s_!8qGP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 848w, https://substackcdn.com/image/fetch/$s_!8qGP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 1272w, https://substackcdn.com/image/fetch/$s_!8qGP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!8qGP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png" width="1456" height="355" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:355,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:48089,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/173099879?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!8qGP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 424w, https://substackcdn.com/image/fetch/$s_!8qGP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 848w, https://substackcdn.com/image/fetch/$s_!8qGP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 1272w, https://substackcdn.com/image/fetch/$s_!8qGP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F900ab1df-4902-4c66-b3a2-1ea5cd1ae638_1460x356.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><div><hr></div><h4><strong>Practical Applications</strong></h4><p>This kind of string comparison has a wide range of uses. 
In security, anagram logic can help obscure data while preserving its verifiability. In games and puzzles, it&#8217;s a core mechanic for word challenges. In natural language processing, it helps in analyzing lexical variation and stylistic patterns. And in systems design, scrambling identifiers using anagram principles can enhance privacy without losing uniqueness.</p><p>While sorting both strings is also a valid method for checking anagrams, its performance degrades as input size grows. Sorting requires approximately <em>n &#215; log n</em> steps (<em>O(n log n)</em>, for those familiar with Big O notation), where <em>n</em> is the length of the string. Doubling the string length results in more than double the processing time. In contrast, counting character frequencies and comparing them takes only <em>O(n)</em> time, since it involves a single pass through each string. This linear-time approach is significantly faster and more scalable, making it the preferred method for high-performance applications.</p><p>To make the program more robust, it could be extended to handle cases such as Unicode or accented characters, which are common in multilingual datasets. From a performance standpoint, counting characters is generally more efficient than sorting, especially when dealing with long strings or large volumes of comparisons. 
For a cleaner implementation, Python&#8217;s <em>collections.Counter</em> class offers a concise and powerful way to perform frequency-based comparisons with minimal code.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Whole Truths and Half Lies by Rukmini S ]]></title><description><![CDATA[How India Thinks, Acts, Prays, Loves & Marries]]></description><link>https://on.valuecurve.ai/p/whole-truths-and-half-lies-by-rukmini</link><guid isPermaLink="false">https://on.valuecurve.ai/p/whole-truths-and-half-lies-by-rukmini</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Wed, 10 Sep 2025 03:05:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!eqKJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p><em><a href="https://www.dataforindia.com/">Rukmini S</a> is an independent data journalist based in Chennai, India. She is the author of Whole Numbers and Half Truths and publishes Data for India, crafting data-driven stories. Whole Numbers and Half Truths is an attempt to make sense of Indian life through the substrate of data.
It explores private and public dimensions of how India thinks, acts, prays, and even loves&#8212;drawing from reliable sources like Lokniti-CSDS, Pew Research, and the Centre for Monitoring the Indian Economy (CMIE) among others.</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!eqKJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!eqKJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 424w, https://substackcdn.com/image/fetch/$s_!eqKJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 848w, https://substackcdn.com/image/fetch/$s_!eqKJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 1272w, https://substackcdn.com/image/fetch/$s_!eqKJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!eqKJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png" width="1456" height="820" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:820,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3188923,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:&quot;&quot;,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/173000742?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!eqKJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 424w, https://substackcdn.com/image/fetch/$s_!eqKJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 848w, https://substackcdn.com/image/fetch/$s_!eqKJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 1272w, https://substackcdn.com/image/fetch/$s_!eqKJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F823d7fee-5cc1-4b0e-a362-747c58292cb6_2366x1332.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container 
restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div><hr></div><h4><strong>The Problem with Narratives</strong></h4><p>The book&#8217;s premise is: most established narratives about India are fiction masquerading as fact. Despite having one of the world&#8217;s most impressive statistical architectures&#8212;Census enumerators, National Statistical Office surveyors, the National Crime Records Bureau&#8212;Indians continue to build inaccurate narratives about how the country works. They assume vote-banks where none exist, imagine urban explosions with little reason, and decide they are middle class with little basis.
These narratives, repeated enough times, become political fodder.</p><p>Numbers, she argues, can capture the nuance and humanity that pre-packaged narratives flatten out.</p><h4><strong>Belief in God, Marriage &amp; Love</strong></h4><p>It&#8217;s claimed 97% of Indians believe in God, with Buddhists as the notable exception. Buddhist traditions do not promote belief in a creator god, focusing instead on ethical conduct and discipline to attain nirvana&#8212;the cessation of suffering. While rituals exist in many Buddhist schools, they are not central to liberation. Religion remains important to Indians: 60% say they pray daily. Education level doesn&#8217;t affect religious practice. Indians are largely indifferent to, and tolerant of, the religious practices of others.</p><p>The book examines patterns in vegetarian vs. non-vegetarian food consumption, and how food habits intersect with upward mobility. It shows that modern India remains largely conservative&#8212;more orthodox in religious practice. It explores dietary choices, alcohol consumption, and the types of alcohol preferred. Foods like poultry, eggs, fish, fruits, and legumes&#8212;often more expensive&#8212;reveal distinct consumption patterns across income groups.</p><p>Indians overwhelmingly marry within their caste. The average age gap between husband and wife is around five years. The poorest 40% of women tend to marry before 18. Richer, better-educated women marry later. Over 90% of marriages in India are arranged by families. Love marriages are more common among richer, better-educated people, and among Christians and Muslims.</p><div><hr></div><h4><strong>Work, Money &amp; Leisure</strong></h4><p>The book explores how Indians earn, how much they earn, and how they spend on leisure. Enjoyment varies by caste, class, gender, and geography. Most women still aren&#8217;t paid for household labor. 
Nearly 40% of Scheduled Castes work as wage laborers&#8212;mostly in casual roles.</p><p>Riding motorbikes and singing chants are favourite leisure activities. Rich and upper-caste groups spend more time on religious practice and have more access to media. Inter-religious marriages are seen as acts of rebellion.</p><p>Fewer than 20% of women have their names on house papers; half have a bank account in their name; only 10% can make primary purchase decisions. For women working all day, talking to someone is the most meaningful leisure&#8212;because no one listens to them.</p><div><hr></div><h4><strong>Crime Statistics: A Cautionary Tale</strong></h4><p>One of the book&#8217;s most striking revelations concerns crime data. Rukmini&#8217;s investigation of 600 rape cases in Delhi&#8217;s district courts revealed that a significant portion were not sexual assaults as commonly understood, but cases of consenting couples whose families had filed complaints to thwart inter-caste or inter-religious relationships. The FIRs followed informal scripts&#8212;&#8220;moving cars&#8221; that abducted young women, &#8220;sedative-laced cold drinks&#8221; that rendered victims unconscious. None of it stood up in court. The cold drink bottle was the smoking gun that never appeared.</p><p>This isn&#8217;t unique to sexual crime. Her investigation of Mumbai&#8217;s mephedrone cases found that police knowingly used wrong sections of law to arrest pedlars.
The result: 100% acquittal rate, with nearly 150 young people spending over a year in jail on charges the police knew were legally unsound.</p><p>The takeaway: India&#8217;s crime statistics begin from a point of significant under-reporting, and states with higher reported crime might actually be doing a better job of ensuring full reporting rather than being the most unsafe.</p><div><hr></div><h4><strong>Human Rights, Media &amp; Voting Preferences</strong></h4><p>There are inferences that media exposure benefits the ruling BJP more than Congress, which performs better among those with low media access. Half of Indians vote based on caste. Many are open to strong leaders and tolerate media bias, discrimination, and segregation based on caste and religion.</p><p>Human rights and freedom of expression aren&#8217;t widely prioritized. From fans to cult followers, the idea of a fair judiciary or honest elections ranks low. Yet 70% still trust their state government. Media continues to reinforce bias. Among urban graduates and postgraduates, 66.6% lean toward ethnic nationalism and majoritarian views. Ghettoisation of minorities&#8212;especially Muslims&#8212;is rampant, from physical segregation to anti-Muslim rhetoric in mass media.</p><p>Voter turnout is around two-thirds, with more women voting than ever. The surge in female voters reflects growing participation. Ironically, the poor vote because it&#8217;s their <em>right</em>, while the non-poor vote for <em>material benefits</em> or out of a <em>sense of duty</em>.</p><p>Indians are on the move&#8212;especially from states like UP, Bihar, and Bengal. Migrants often can&#8217;t vote. In 2015, <a href="http://www.janaagraha.org/">Janagraha</a>, a Bangalore-based trust working in the areas of urban infrastructure and citizenship, reported that 11% of voter addresses couldn&#8217;t be found, 21% had moved, and nearly 50% couldn&#8217;t find their names on Delhi&#8217;s voter list. 
Pollsters and journalists often frame voting as driven by caste, religion, leaders, or development&#8212;but the reality is more nuanced. After the CAA protests, Muslims voted for AAP (not Congress), even though Congress supported the agitation&#8212;because AAP was more likely to defeat BJP. Notably, BJP has no elected Muslim representative.</p><div><hr></div><h4><strong>The Data Imperative</strong></h4><p>The book&#8217;s conclusion carries urgency. Indian official statistics are not lying to us, Rukmini argues, but they are being silenced. A combination of neglect, discredit, and dismissal makes deficiencies seem too fatal to fix. When inconvenient data is suppressed, the narrative shipped to op-ed pages is that official statistics miss too much, so we&#8217;re better off without them.</p><p>This is dangerous. For 75 years, this data has shaped policy and driven change&#8212;debates about liberalisation, poverty lines, welfare states, affirmative action, rising Islamophobia. Access to this data has empowered ordinary citizens to engage with and agitate against the State. 
Without it, we hollow out democracy.</p><div><hr></div><p><strong>Final Thoughts</strong></p><p><em>First published in 2021, the book draws on datasets that may now appear dated in places; even so, it is an interesting read for those who are curious about modern India&#8212;and essential for anyone who wants to engage critically with the narratives we&#8217;re fed.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share&quot;,&quot;text&quot;:&quot;Share ValueCurve&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.valuecurve.co/?utm_source=substack&amp;utm_medium=email&amp;utm_content=share&amp;action=share"><span>Share ValueCurve</span></a></p>]]></content:encoded></item><item><title><![CDATA[Setting Up Your AI Dev Environment]]></title><description><![CDATA[Visual Studio, Python, Claude CLI & GitHub]]></description><link>https://on.valuecurve.ai/p/setting-up-your-dev-environment</link><guid isPermaLink="false">https://on.valuecurve.ai/p/setting-up-your-dev-environment</guid><dc:creator><![CDATA[Sarfaraz Mulla]]></dc:creator><pubDate>Sun, 07 Sep 2025 05:31:12 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!REm7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Getting started with AI development requires the right tools. This guide walks you through setting up a complete development environment on Mac, perfect for beginners looking to improve their AI fluency.</p><h4>1.
Installing Visual Studio Code</h4><p>Visual Studio Code (VSCode) is a powerful, free code editor that's essential for development.</p><p><strong>Download and Install:</strong></p><ol><li><p>Visit <a href="https://code.visualstudio.com/">code.visualstudio.com</a></p></li><li><p>Click "Download for Mac"</p></li><li><p>Open the downloaded <code>.zip</code> file</p></li><li><p>Drag the Visual Studio Code app to your Applications folder</p></li><li><p>Launch VSCode from Applications or Spotlight</p></li></ol><p><strong>First Launch Setup:</strong></p><ul><li><p>VSCode may ask for permissions to access folders - click "Allow"</p></li><li><p>Install the Python extension by Microsoft from the Extensions panel (Cmd+Shift+X)</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!REm7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!REm7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!REm7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!REm7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 1272w, 
https://substackcdn.com/image/fetch/$s_!REm7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!REm7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1681167,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.valuecurve.co/i/172565957?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!REm7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!REm7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 848w, 
https://substackcdn.com/image/fetch/$s_!REm7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!REm7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F771270c2-71eb-4d4b-a563-eb69779bc139_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div></li></ul><p class="button-wrapper" 
data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/subscribe?"><span>Subscribe now</span></a></p><h4>2. Installing and Configuring Python</h4><p>Python is the most popular language for AI development, offering extensive libraries and frameworks.</p><p><strong>Install Python using Homebrew:</strong></p><ol><li><p>Open Terminal (press <code>Cmd+Space</code>, type "Terminal", press Enter)</p></li><li><p>First, install Homebrew (if not already installed):</p></li></ol><pre><code><code>/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"</code></code></pre><ol start="3"><li><p>Install Python:</p></li></ol><pre><code><code>brew install python</code></code></pre><ol start="4"><li><p>Verify installation:</p></li></ol><pre><code><code>python3 --version
pip3 --version</code></code></pre><p><strong>Configure Python Environment:</strong></p><ol><li><p>In Terminal, create a virtual environment for your projects:</p></li></ol><pre><code><code>python3 -m venv ~/ai-projects</code></code></pre><ol start="2"><li><p>Activate the environment:</p></li></ol><pre><code><code>source ~/ai-projects/bin/activate</code></code></pre><ol start="3"><li><p>Install essential packages:</p></li></ol><pre><code><code>pip install jupyter pandas numpy matplotlib requests</code></code></pre><p><strong>VSCode Python Setup:</strong></p><ol><li><p>Open VSCode and press <code>Cmd+Shift+P</code></p></li><li><p>Type "Python: Select Interpreter"</p></li><li><p>Choose the interpreter from your virtual environment (<code>~/ai-projects/bin/python</code>)</p></li></ol><p><strong>Configure Shell Profile:</strong> In Terminal, add the virtual environment path to your shell profile for easy activation:</p><pre><code><code>echo 'alias activate-ai="source ~/ai-projects/bin/activate"' &gt;&gt; ~/.zshrc
source ~/.zshrc</code></code></pre><p>Now you can simply type <code>activate-ai</code> to activate your environment.</p><h4>3. Installing and Configuring Claude CLI</h4><p>The Claude CLI (distributed as Claude Code) enables direct interaction with Claude AI from your terminal, streamlining AI-assisted development.</p><p><strong>Installation:</strong></p><ol><li><p>In Terminal, install Node.js (required for the Claude CLI):</p></li></ol><pre><code><code>brew install node</code></code></pre><ol start="2"><li><p>Install the Claude CLI globally:</p></li></ol><pre><code><code>npm install -g @anthropic-ai/claude-code</code></code></pre><p><strong>Configuration:</strong></p><ol><li><p>Get your API key from <a href="https://console.anthropic.com/">console.anthropic.com</a></p></li><li><p>In Terminal, export the key so the CLI can find it:</p></li></ol><pre><code><code>export ANTHROPIC_API_KEY=your_api_key_here</code></code></pre><ol start="3"><li><p>Test the installation:</p></li></ol><pre><code><code>claude --version</code></code></pre><p><strong>Basic Usage:</strong></p><ul><li><p>Start a conversation: <code>claude</code></p></li><li><p>Get help with code: <code>claude -p "explain this Python function" &lt; myfile.py</code></p></li><li><p>Generate code: <code>claude -p "create a Python script to read CSV files"</code></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/p/setting-up-your-dev-environment?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/p/setting-up-your-dev-environment?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></li></ul><h4>4.
Configuring VSCode with GitHub</h4><p>GitHub integration enables version control and collaboration, essential skills for any developer.</p><p><strong>Install Git:</strong> In Terminal, run:</p><pre><code><code>brew install git</code></code></pre><p><strong>Configure Git:</strong> Still in Terminal:</p><pre><code><code>git config --global user.name "Your Name"
git config --global user.email "your.email@example.com"</code></code></pre><p><strong>GitHub Authentication:</strong></p><ol><li><p>In Terminal, install GitHub CLI:</p></li></ol><pre><code><code>brew install gh</code></code></pre><ol start="2"><li><p>Authenticate with GitHub:</p></li></ol><pre><code><code>gh auth login</code></code></pre><ol start="3"><li><p>Follow the prompts to authenticate via browser</p></li></ol><p><strong>VSCode GitHub Setup:</strong></p><ol><li><p>Install the "GitHub Pull Requests and Issues" extension</p></li><li><p>Sign in to GitHub when prompted in VSCode</p></li><li><p>Open the Source Control panel (Cmd+Shift+G)</p></li></ol><p><strong>Creating Your First Repository:</strong></p><ol><li><p>Create a new folder for your project</p></li><li><p>Open it in VSCode: <code>File &gt; Open Folder</code></p></li><li><p>Initialize Git using VSCode's integrated terminal (<code>Ctrl+`</code>):</p></li></ol><pre><code><code>git init</code></code></pre><ol start="4"><li><p>Create a proper <code>.gitignore</code> file in the VSCode terminal:</p></li></ol><pre><code><code># Create .gitignore with common exclusions
cat &gt; .gitignore &lt;&lt; EOF
# Virtual environments
venv/
env/
.env
# Python
__pycache__/
*.pyc
*.pyo
.Python
# IDEs
.vscode/settings.json
.DS_Store
# API keys and secrets
.env.local
config/secrets.json
# Jupyter Notebooks
.ipynb_checkpoints/
EOF</code></code></pre><ol start="5"><li><p>Create a README file and make your first commit</p></li><li><p>Push to GitHub using the Source Control panel</p></li></ol><p><strong>Environment Variables Setup:</strong> For secure API key management, create a <code>.env</code> file in the VSCode terminal (already in .gitignore):</p><pre><code><code># Create .env file for secrets
echo "ANTHROPIC_API_KEY=your_api_key_here" &gt; .env
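# One common way to load the key into your current shell session later
# (an illustrative pattern; assumes simple KEY=value lines with no
# spaces or quotes in the values):
export $(grep -v '^#' .env | xargs)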
</code></code></pre><p><strong>Next Steps</strong></p><p>This environment is optimized for AI development. The combination of VSCode's editing capabilities, Python's AI libraries, Claude CLI's assistance, and GitHub's collaboration tools provides everything needed to begin your AI fluency journey.</p><p>Remember to activate your Python virtual environment (<code>activate-ai</code> or <code>source ~/ai-projects/bin/activate</code>) each time you start a new terminal session for development work. Always keep sensitive information like API keys in <code>.env</code> files that are excluded from version control.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://on.valuecurve.ai/p/setting-up-your-dev-environment/comments&quot;,&quot;text&quot;:&quot;Leave a comment&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://on.valuecurve.ai/p/setting-up-your-dev-environment/comments"><span>Leave a comment</span></a></p>]]></content:encoded></item></channel></rss>