The widely reported increase in the frequency of high-impact, low-probability extreme weather events poses significant challenges to the resilient operation of electric power systems. This dissertation explores strategies to enhance operational resilience by addressing the distribution network's ability to adapt to changing operating conditions. We introduce a novel Dual Agent-Based framework, built on the deep reinforcement learning (DRL) paradigm, for optimizing the scheduling of distributed energy resources (DERs) within a networked microgrid (N-MG). The framework aims to minimize operational and environmental costs during normal operation while improving the critical load supply index (CSI) under emergency conditions. We further introduce a multi-temporal dynamic reward-shaping structure, augmented with an error coefficient, to improve the agents' learning process. To manage loads appropriately during emergencies, we propose a load flexibility classification scheme that categorizes loads by their criticality indices. The scalability of the proposed approach is demonstrated through multiple case studies on a modified IEEE 123-node benchmark distribution network. We also test the proposed method with different DRL algorithms to demonstrate its compatibility and ease of application, and we compare the results against traditional metaheuristic algorithms, namely particle swarm optimization (PSO) and the genetic algorithm (GA). Finally, a sensitivity study provides deeper insight into the developed model; its key findings are consistent with the mathematical foundation of the approach presented in this dissertation.
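To make the reward-shaping idea above concrete, the following minimal Python sketch shows one way a multi-temporal, error-penalized reward could be composed for the scheduling agents. The function name `shaped_reward`, the weighting schedule, and the coefficient `k_err` are illustrative assumptions, not the dissertation's exact formulation.

```python
def shaped_reward(op_cost, env_cost, csi, error, step, horizon,
                  emergency=False, k_err=0.5):
    """Illustrative multi-temporal reward for a DER-scheduling DRL agent.

    op_cost, env_cost : operational and environmental costs (minimized)
    csi               : critical load supply index (maximized in emergencies)
    error             : constraint-violation measure scaled by k_err
    step, horizon     : current step and episode length, which drive the
                        time-varying (multi-temporal) weighting
    """
    w = step / horizon  # dynamic weight that evolves over the episode
    if emergency:
        # Emergency conditions: prioritize critical load supply,
        # with a light penalty on cost.
        base = csi - 0.1 * (op_cost + env_cost)
    else:
        # Normal operation: minimize a time-weighted blend of
        # operational and environmental cost.
        base = -((1.0 - w) * op_cost + w * env_cost)
    # The error coefficient penalizes infeasible actions,
    # sharpening the learning signal.
    return base - k_err * error
```

After each dispatch action the agent would receive this scalar; varying the weight `w` across the episode lets the reward emphasize different objectives at different timescales, which is the intent of the multi-temporal shaping described above.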