Gestalt IT

The On-Premise IT Roundtable gathers the best independent enterprise IT voices, puts them around a table, and gets them talking around a single topic. It breaks the traditional IT silos, taking on topics from across the isolated realms of servers, networking, storage, cloud, and mobility.

  • Wi-Fi 7 Isn’t Enough For Future Wireless Needs

    New technology standards can’t anticipate how users will consume content and applications, and revisions to the standards will be adopted to meet their needs. In this episode, Tom Hollingsworth is joined by Ron Westfall, Drew Lentz, and Rocky Gregory as they discuss where Wi-Fi 7 falls short. Even though Wi-Fi 7 is a new standard, it is still based on older thinking, and users have changed the way they consume content and applications. This episode discusses the difference between cloud-hosted applications and local software as well as the drive to increase performance on edge access points to include faster response times to things like AI assistants.

    Mobility Field Day 11 Pre-Event Podcast – The Tech Field Day Podcast

    Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

    Wi-Fi 7 Isn’t Enough For Future Wireless Needs

    When the development of Wi-Fi 7 started, the promise of 5G speeds was simply theoretical. We were still struggling with the rollout of LTE, and given how problematic 3G was before it, Wi-Fi just made sense. Fast forward to the modern wireless era and you find that 5G connectivity is not only more stable but, in many cases, much faster than the networks you can connect to at the local coffee shop. Add in the protection mechanisms inherent in cellular technology and it appears to be a significantly better user experience.

    Users have also changed the way they do work. Before the pandemic, the majority of work was done with applications that connected to internal resources at a company office. You needed private wireless connectivity to access important resources. The cloud was making changes, but users still felt comfortable working at their desks. Five years later, most work is done through applications that connect to cloud resources. There isn’t as much of a need to go to the office, and that’s if you even still have one. Users don’t need fast enterprise connectivity. They just need to get to the cloud somehow.

    The third major factor in the lack of performance for Wi-Fi 7 is the rise of more intensive applications. Modern AI development sees a significant push to have processing done centrally while algorithms run at the edge. We need more powerful devices at the edge to take advantage of those capabilities, but the trend in development was previously to use more modest devices to meet the power budgets of edge switches delivering standard power capabilities. While power standards have increased to allow for more powerful hardware, the older style of thinking still persists.

    This episode debates these topics as well as others to help you understand what the current state of Wi-Fi 7 is and how it will help you and your users with their connectivity needs.

    Podcast Information:

    Tom Hollingsworth is a Networking and Security Specialist at Gestalt IT and Event Lead for Tech Field Day. You can connect with Tom on LinkedIn and X/Twitter. Find out more on his blog or on the Tech Field Day website.

    Ron Westfall is The Research Director at The Futurum Group specializing in Digital Transformation, 5G, AI, Security, Cloud Computing, IoT and Data Center as well as the host of 5G Factor Webcast. You can connect with Ron on LinkedIn and on X/Twitter and see his work on The Futurum Group’s website.

    Rocky Gregory is a Technical Leader in mobility and Wi-Fi and a long time Field Day Delegate. You can connect with Rocky on LinkedIn and on X/Twitter and learn more on his website.

    Drew Lentz is the Co-Founder of Frontera Consulting and the host of Waves Podcast. You can connect with Drew on LinkedIn and on X/Twitter and learn more on his website.

    Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube, Apple Podcasts, Spotify, or your favorite podcast application so you don’t miss an episode. Please do give us a rating and a review; it helps with discoverability. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group. For upcoming events and more episodes, head to the Tech Field Day website.

    © Gestalt IT, LLC for Gestalt IT: Wi-Fi 7 Isn’t Enough For Future Wireless Needs

    14 May 2024, 2:00 pm
  • Data Quality is More Important Than Ever in an AI World with Qlik

    In our AI-dominated world, data quality is the key to building useful tools. This episode of the Tech Field Day podcast features Drew Clarke from Qlik discussing best practices for integrating data sources with AI models with Joey D’Antoni, Gina Rosenthal, and Stephen Foskett before Qlik Connect in Orlando. Although there is a lot of hype about AI in industry, companies are realizing the risks of generative AI and large language models as well. Solid data practices in terms of data hygiene, proven data models, business intelligence, and flows can ensure that the output of an AI application is correct. The proliferation of generative AI is also causing a rapid increase in the cost and environmental impact of IT systems, and this will affect the success of the technology. Good data practices can help, allowing a lighter and less expensive LLM to produce quality results. The Tech Field Day delegates will learn more about these topics at Qlik Connect in Orlando, and we will be recording and sharing content as well.

    This Spotlight Podcast is Brought to You by Qlik

    Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

    As AI technologies like generative AI and large language models continue to appear, the foundation upon which these technologies are built – data – becomes the linchpin of their success. This episode of the Tech Field Day podcast features a discussion with Drew Clarke from Qlik, alongside industry experts Joey D’Antoni, Gina Rosenthal, and Stephen Foskett. Ahead of Qlik Connect in Orlando, the panel discussed the best practices for integrating data sources with AI models, underlining the importance of data quality in an AI-dominated world.

    The proliferation of AI technologies has brought with it an increased awareness of the potential risks associated with generative AI and LLMs. As companies venture into the realm of AI, the realization that not all AI is capable of delivering accurate or useful outcomes has become apparent. This acknowledgment has brought traditional data practices such as data hygiene, data quality, proven data models, business intelligence, and data flows into the spotlight. These practices ensure that the output of an AI application is correct and reliable.
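    Data hygiene checks of the kind the panel describes can start very simply. As a rough, hypothetical illustration (a sketch, not Qlik's implementation), a few lines of Python can flag exact duplicate records and rows missing required fields before data ever reaches a model:

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int       # records examined
    duplicates: int  # exact duplicate rows found
    missing: int     # rows missing a required field

def check_records(records: list[dict], required: list[str]) -> QualityReport:
    """Basic hygiene pass over rows destined for an AI pipeline."""
    seen = set()
    duplicates = 0
    missing = 0
    for rec in records:
        # Canonical form of the row for duplicate detection
        key = tuple(sorted(rec.items()))
        if key in seen:
            duplicates += 1
        seen.add(key)
        # Treat None or empty string in a required field as missing
        if any(rec.get(field) in (None, "") for field in required):
            missing += 1
    return QualityReport(total=len(records), duplicates=duplicates, missing=missing)
```

    Checks like these are no substitute for proven data models and business intelligence flows, but they catch the cheapest problems first.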

    One of the critical challenges is the integration of data into LLMs and small language models. We consider metadata, data security, and the implications of regulations like GDPR and the California Data Privacy Act on data integration with AI models. It is critical to consider data privacy and to avoid exposing private data as companies integrate data into their AI models.
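    One common safeguard is to redact obvious personal identifiers before text leaves the organization. The patterns below are hypothetical minimal examples for illustration only; a production system would rely on a vetted PII-detection library and locale-aware rules:

```python
import re

# Hypothetical minimal patterns -- real deployments need far broader coverage.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched identifiers with typed placeholders before the
    text is sent to an external language model."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

    Typed placeholders like [EMAIL] preserve the shape of the text, so a model can still reason about it without ever seeing the underlying values.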

    We should also consider the societal and environmental impacts of the rapid increase in the use of AI, as well as the cost of inferencing. The environmental footprint of data centers, driven by the energy and water consumption required to support AI computations, is a particular area of concern. This underscores the need for good data practices that not only ensure the quality of AI outputs but also contribute to the sustainability of AI technologies.

    Data is a key product in an AI world, and we must treat data with the same care and consideration as we do in conventional applications. This involves curating, managing, and continuously improving data to ensure its quality and relevance. Data engineers and business analysts play a key role in enhancing productivity and effectiveness of AI capabilities.

    This discussion is a reminder of the critical importance of data quality in the age of AI. As companies navigate the complexities of integrating AI into their operations, the foundational principles of data hygiene, data quality, and proven data models remain as relevant as ever. We look forward to discussing these themes at Qlik Connect in June, and invite our audience to attend the event!

    Visit the Qlik Connect official website for more information and registration.

    Podcast Information:

    Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

    Gina Rosenthal is the Founder and CEO of Digital Sunshine Solutions. You can connect with Gina on LinkedIn and listen to her podcast, The Tech Aunties Podcast. Learn more on her website.

    Joey D’Antoni is a Principal Consultant at Denny Cherry & Associates Consulting. You can connect with Joey on LinkedIn, on Mastodon, and on X/Twitter or read more about him and his work on his website.

    Drew Clarke is the General Manager & EVP of the Data Business Unit at Qlik. You can connect with Drew on LinkedIn and learn more about Qlik by visiting their website.


    © Gestalt IT, LLC for Gestalt IT: Data Quality is More Important Than Ever in an AI World with Qlik

    7 May 2024, 2:00 pm
  • Containerization is Required to Modernize Applications at the Edge

    Modern applications are widely deployed in the cloud, but they’re coming to the edge as well. This episode of the Tech Field Day podcast features Alastair Cooke and Paul Nashawaty from The Futurum Group, Erik Nordmark from ZEDEDA, and host Stephen Foskett discussing the intersection of application modernization and edge computing. As enterprises look to deploy more applications at the edge, they are leveraging technologies like Kubernetes and containers to enable portability, scalability, resilience, and high availability. In many cases customers are moving existing web applications to the edge to improve performance and security, but not all webscale technologies are appropriate for the limited hardware, environmentals, and connectivity found at the edge. The question is whether to improve the edge compute platform or build resiliency into the application itself. But there are limits to this approach, since edge locations don’t have the elasticity of the cloud and many of the features of Kubernetes were not designed for limited resources. It comes down to developer expectations, since they are now accustomed to the experience of modern webscale platforms and expect this environment everywhere. In the future, we expect WASM, LLMs, and more to be used regardless of location.

    Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

    The modernization of applications, from datacenter to cloud to edge, is rapidly progressing. Technologies drawn from the hyperscale world are finding their way to edge locations, where data processing and analysis occur closer to the source of data and customer transactions. This shift is driven by the need for real-time processing, reduced latency, and enhanced security, and technologies like Kubernetes and containers are increasingly used to facilitate this transition.

    The Benefits of Containerized Applications at the Edge

    Containerization offers many benefits essential for modern applications, especially those deployed at the edge. It provides a level of portability that allows applications to be easily moved and managed across different environments, from the cloud to the edge, without the need for extensive reconfiguration or adaptation. This is particularly important given the diverse and often resource-constrained nature of edge environments, which can vary greatly in terms of hardware, connectivity, and operational conditions.

    Scalability is another critical aspect of containerization that aligns well with the needs of edge computing. Containers enable applications to be decomposed into microservices, allowing for more granular scaling and management. This microservices architecture facilitates the efficient use of resources, enabling applications to scale up or down based on demand, which is particularly useful in edge environments, again in the face of resource constraints.

    Resilience and high availability are further enhanced through containerization. By deploying applications as a set of interdependent but isolated containers, developers can achieve a level of redundancy and fault tolerance that is difficult to achieve with monolithic architectures. This is crucial at the edge, where the risk of hardware failure, network disruptions, and other environmental factors can pose significant challenges to application availability and reliability.

    The security benefits of containerization should not be overlooked in the context of edge computing either. Containers provide a level of isolation that helps mitigate the risk of cross-application interference and potential security breaches. This isolation is complemented by the ability to apply granular security policies at the container level, enhancing the overall security posture of edge deployments. And containerized applications are easier to keep up to date as security patches are developed.

    Challenges for Modern Applications at the Edge

    Despite these advantages, the deployment of containerized applications at the edge is not without its challenges. The resource limitations of edge environments, including constraints on compute power, storage, and network bandwidth, require careful consideration of the containerization strategy employed. Additionally, the management and orchestration of containers at the edge introduce complexity, particularly in highly distributed environments with potentially thousands of edge locations.

    The choice between improving the edge compute platform to better support containerization and building resilience into the application itself is a critical decision. While enhancing the edge platform can provide a more robust foundation for containerized applications, it may require significant financial and technological investment. Designing applications with inherent resilience and adaptability can offer a more immediate solution, but it may not deliver all of the benefits of containerization.

    The expectations of developers, accustomed to the rich features and flexibility of modern cloud-native platforms, also play a significant role in the adoption of containerization at the edge. Developers seek environments that offer the same level of agility, ease of use, and comprehensive tooling they are familiar with in the cloud, driving the demand for containerization technologies that can replicate this experience at the edge.

    Looking forward, the evolution of containerization at the edge is likely to be influenced by emerging technologies such as WebAssembly (WASM) and large language models (LLMs). WASM promises to enhance the portability and efficiency of applications across diverse computing environments, including the edge, by enabling more lightweight and adaptable application architectures. The integration of AI and machine learning capabilities, particularly for processing and analyzing data at the edge, will further drive the modernization of applications in these distributed environments.

    Containerization and the Edge

    Containerization is a fundamental enabler for the modernization of applications in the cloud, and this is true at the edge as well. It offers the portability, scalability, resilience, and security necessary to address the unique challenges of edge computing, while also meeting the expectations of developers for a modern application development environment. As enterprises continue to push the boundaries of what is possible at the edge, containerization will play a pivotal role in shaping the future of edge computing, driving innovation and enabling new levels of performance and efficiency.

    Podcast Information:

    Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

    Alastair Cooke is a CTO Advisor at The Futurum Group. You can connect with Alastair on LinkedIn or on X/Twitter and you can read more of his research notes and insights on The Futurum Group’s website.

    Paul Nashawaty is a Practice Lead focused on Application Development Modernization at The Futurum Group. You can connect with Paul on LinkedIn and learn more about his research and analysis on The Futurum Group’s website.

    Erik Nordmark is the CTO and Co-founder at ZEDEDA. You can connect with Erik on LinkedIn and learn more about ZEDEDA on their website.


    © Gestalt IT, LLC for Gestalt IT: Containerization is Required to Modernize Applications at the Edge

    30 April 2024, 2:00 pm
  • Security Audits Cause More Harm Than Good

    Security audits are painful and often required for compliance, but they aren’t adversarial unless you have a bad auditor or bad policy compliance. In this episode, Tom Hollingsworth sits down with Teren Bryson, Skye Fugate, and Ben Story to discuss the nuances of audits. The panel discusses the discovery of technical debt, external versus internal auditing, the need for flexibility in procedures, and how good auditors can make for more positive outcomes.

    Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

    Thorough audits will uncover issues with compliance as well as technical debt. This could include older devices that should have been replaced at the end of their life. It could also find code versions that are vulnerable to exploits and could lead to more issues. While operations teams don’t like being told things aren’t as they should be, it’s better to know about those problems early, before they get out of control.

    It is also important to understand that there are different reasons to have an audit. The most common perception is that external organizations are auditing your enterprise to comply with their policies and procedures, such as a partnership or acquisition. However, internal audits carried out by third parties to verify compliance with your own policies are much more frequent. How can you ensure that you are doing what you say you’re doing if you don’t have someone else take a look at your policies to ensure they’re being followed? This is also where you find issues with user compliance, such as executives who believe the rules don’t apply to them.

    A good auditor can make the difference in your audit experience. The best auditors are knowledgeable in the subject area and understand what is needed for compliance. They also ensure that you have time to remediate the issues. A bad auditor is one that only follows the strict procedures and doesn’t understand the nuance in auditing. They are often perceived as adversarial and cause IT teams to dread audits.

    If you want to have a good audit experience, you should keep two things in mind. The first is that you should assume it will be a positive experience. The auditors are doing a job, and they aren’t trying to hurt you or your company. The second is to answer the questions asked without volunteering information. Innocently offering additional information can lead to a negative experience because it forces the auditor to uncover things they weren’t originally tasked to find.

    Podcast Information:

    Tom Hollingsworth is a Networking and Security Specialist at Gestalt IT and Event Lead for Tech Field Day. You can connect with Tom on LinkedIn and X/Twitter. Find out more on his blog or on the Tech Field Day website.

    Ben Story is a Network and Cybersecurity Engineer and Field Day veteran. You can connect with Ben on LinkedIn or on X/Twitter and read more on his personal website.

    Skye Fugate is a dedicated cybersecurity expert. You can connect with Skye on LinkedIn or on X/Twitter and learn more about his work on his website.

    Teren Bryson is a Director of Engineering and Operations. You can connect with Teren on LinkedIn or on X/Twitter. Learn more about Teren on his website.


    © Gestalt IT, LLC for Gestalt IT: Security Audits Cause More Harm Than Good

    23 April 2024, 2:00 pm
  • AI is Smarter Than Your Average Network Engineer

    Recent advances in AI for IT have shown the huge potential for changing the way that we do work. However, AI can’t replace everyone in the workforce. In this episode, Tom Hollingsworth is joined by Rita Younger, Josh Warcop, and Rob Coote as they look at how the hype surrounding AI must inevitably be reconciled with the reality of real people doing work. They discuss the way that AI is judged for its mistakes versus a human as well as how marketing is pushing software as the solution to all our staffing ills.

    Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

    AI will change the way that IT teams configure and manage their systems, but it will take time for those teams to integrate it into their current workflows and assure everyone it is a boon. This episode features Rita Younger, Joshua Warcop, and Rob Coote talking to Tom Hollingsworth about how much AI has already changed and how far it has to go in order to be a fully featured solution. They discuss not only the gaps in AI but also the gaps that knowledge workers embody today.

    While AI draws from a depth of knowledge that encompasses documentation and best practices, it is not a perfect solution today. There is nuance in the discussion that comes from years of experience in a given discipline. People learn from their mistakes, and we expect them to make those mistakes as they grow as workers. When AI makes mistakes, we are immediately skeptical and race to find a way to prevent it from ever happening again. Should AI be given the same grace that we extend to people?

    Another part of this discussion highlights how AI is touted to replace so many things in IT, yet we’ve heard this hype before and no one has been completely put out of work. Every technology eventually finds a niche to fill and performs to its actual capabilities instead of the overinflated promise that was used to market it. As a technology matures, operations and design teams find the optimal way to use a solution instead of accepting the idea that it will replace everything.

    The discussion wraps up with ideas from the panel about what questions you should be asking today when it comes to AI. The reality is that we will need to incorporate this technology into our workflows but we need to verify that it’s going to be a help as we train it to do the things we want it to do for us.

    Podcast Information:

    Tom Hollingsworth is a Networking and Security Specialist at Gestalt IT and Event Lead for Tech Field Day. You can connect with Tom on LinkedIn and X/Twitter. Find out more on his blog or on the Tech Field Day website.

    Josh Warcop is a Cloud Innovation Architect. You can connect with Josh on LinkedIn and on X/Twitter and learn more about him on his website.

    Rita Younger is a Practice Lead of Data Center Networking at World Wide Technology. You can connect with her on LinkedIn or on X/Twitter and read more on her website.

    Rob Coote is a Cybersecurity and Infrastructure Director. You can connect with Rob on LinkedIn or on X/Twitter and learn more on his website.


    © Gestalt IT, LLC for Gestalt IT: AI is Smarter Than Your Average Network Engineer

    16 April 2024, 2:00 pm
  • Cyber Resiliency is Just Data Protection

    Cyber Resiliency is a term that encompasses much more than simply protecting data. This episode features Tom Hollingsworth joined by Krista Macomber and Max Mortillaro discussing the additional features in a cyber resiliency solution and the need to understand how data needs to be safeguarded from destruction or exploitation. The episode highlights the shift from reactive to proactive measures as well as the additional integrations that are needed between development, deployment, and operations teams to ensure success.

    Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

    The panel talks about how backup and recovery have always been seen as reactive measures to disaster rather than an integrated piece of a more proactive solution for other causes of outages. Only when security incidents became more impactful and caused more data loss or theft did the need arise for more protective measures, such as entropy detection of data corruption or immutability of stored copies.
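    Entropy detection rests on a simple observation: encrypted or ransomware-corrupted data looks statistically random. A minimal sketch in Python shows the idea (the 7.5 bits-per-byte threshold is an illustrative assumption, not a vendor default):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: near 8.0 for encrypted or random data,
    much lower for typical documents and source code."""
    if not data:
        return 0.0
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Backup scanners flag files whose entropy jumps toward the 8-bit
    # maximum, a common signature of ransomware encryption.
    return shannon_entropy(data) > threshold
```

    Note that compressed formats such as zip and JPEG also show high entropy, which is why real products compare against a per-file baseline rather than a single fixed cutoff.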

    However, truly resilient solutions need more than just technical features. Other necessary pieces like policy-based enforcement of data retention and recovery objectives are crucial. So too is the need for security measures that prevent critical system processes from being exploited to achieve attacker goals. Operations teams must be involved in the entire process to keep users online with clean data while also allowing incident response teams to investigate and eliminate points of intrusion and data corruption and loss.

    The episode wraps up with important questions that need to be answered when investigating solutions. Just because someone tells you that a solution is resilient doesn’t mean you should believe their claims. By asking good questions about the capabilities of the system during the investigation phase, you should find yourself with a usable system that prevents data loss and ensures business continuity in the future.

    Podcast Information:

    Tom Hollingsworth is a Networking and Security Specialist at Gestalt IT and Event Lead for Tech Field Day. You can connect with Tom on LinkedIn and X/Twitter. Find out more on his blog or on the Tech Field Day website.

    Krista Macomber is a Research Director at The Futurum Group. You can connect with Krista on LinkedIn and on X/Twitter and read more of her research and insights on The Futurum Group’s website.

    Max Mortillaro is an Industry Analyst & Partner at TECHunplugged. You can connect with Max on LinkedIn and on X/Twitter and learn more on the TECHunplugged website and blog.


    © Gestalt IT, LLC for Gestalt IT: Cyber Resiliency is Just Data Protection

    9 April 2024, 2:00 pm
  • Credible Content From the Community is More Important than Ever

    There is a hazardous amount of AI-generated and SEO-oriented content being generated, and the solution is real stories from real communities. In the first episode of the Tech Field Day Podcast, recorded on-site at AI Field Day, Stephen Foskett chats with Frederic Van Haren, Gina Rosenthal, and Colleen Coll about confronting inauthentic content. The internet is inundated with low-quality, AI-generated, and SEO-driven content, and the antidote is the cultivation of real, credible voices within the tech community. The discussion focuses on the importance of community-driven content and the credibility of individual voices in an era dominated by content optimized for algorithms rather than human engagement. The rise of generative AI in content creation and consumption is accelerating, and we must all find a balance between technological advancements and human insight. This is the essence of the Tech Field Day experience, which fosters meaningful dialogue among tech professionals and companies in the industry. For fifteen years, Tech Field Day has highlighted the critical role of human connection and credible voices in navigating the digital information landscape, and this re-launched podcast is part of that continuing effort.

    The entire internet is saturated with AI-generated content and SEO-driven articles, and tech media is no exception. Gestalt IT was founded to highlight the value of genuine community engagement and the power of independent voices. That’s why we started Tech Field Day in 2009: to give these independent technical experts a platform to learn, share, and explore enterprise IT. Today we are re-launching the Tech Field Day podcast to return to this foundational ethos, but also to rise to the challenge of providing credible, human-centered narratives in tech media.

    The internet is the foundation for our industry, but this same technology threatens to undermine the fabric of genuine voices. The proliferation of content optimized for algorithms rather than humans has diluted the quality of information, leaving readers navigating a maze of inauthenticity and downright falsehood. That’s why community-driven content and the credibility of individual voices is so important. It’s a challenge that the Tech Field Day podcast is being re-launched to address head-on.

    Tech Field Day is designed to be an environment where technology professionals and companies can engage in meaningful dialogue. This engagement is built on their authentic voices, and we have always tried to bring a diverse array of insights and experiences to the table. Like the event series, this podcast is designed to serve as a platform for these voices to cut through the noise of generative AI and SEO manipulation, offering perspectives rooted in real-world experiences and knowledge.

    This first episode focuses on AI’s impact on society and the authenticity crisis in content creation. How can we build human connection when so much is automated? And yet we are not anti-AI: The challenge is how to leverage its capabilities while always ensuring that it complements rather than replaces our authentic voices. We are reminded of the importance of critical thinking and the value of community in navigating the flood of information. We need to find a balance between technological advancement and maintaining the integrity of human expression.

    Inauthentic content can never match the feeling of a real discussion among passionate and knowledgeable experts. Tech Field Day is dedicated to fostering open discussion and we urge the tech community to rally around the principles of authenticity and credibility. SEO-optimized spam can never drown out real people as long as we keep questioning, discussing, and sharing. And we must embrace the changing social media landscape to help bring these conversations to the world through Tech Field Day events, this podcast, and our individual platforms.

    Podcast Information:

    Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

    Frederic Van Haren is the CTO and Founder at HighFens Inc., Consultancy & Services. Connect with Frederic on LinkedIn or on X/Twitter, and check out the HighFens website.

    Gina Rosenthal is the Founder and CEO of Digital Sunshine Solutions. You can connect with Gina on LinkedIn, listen to her podcast, The Tech Aunties Podcast, and learn more on her website.

    Colleen Coll specializes in Global Events Management and Digital Media Operations. You can connect with Colleen on LinkedIn or on X/Twitter and learn more about her on her website.

    Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube, Apple Podcasts, Spotify, or your favorite podcast application so you don’t miss an episode. Please do give us a rating and a review, it helps with discoverability. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group. For upcoming events and more episodes, head to the Tech Field Day website.

    © Gestalt IT, LLC for Gestalt IT: Credible Content From the Community is More Important than Ever

    2 April 2024, 2:00 pm
  • Reintroducing the Tech Field Day Podcast

    We are once again returning to the Tech Field Day name for our weekly podcast. In this episode, Stephen Foskett and Tom Hollingsworth delve into the history of the podcast, how it came to prominence and what sets it apart from other technical podcasts. We also discuss why each episode has a premise and why the name has been the On-Premise IT Podcast for so long.

    Why Now?

    We’re changing things up around here! Don’t worry, the only thing that is going to be different is the name of the podcast. We’re going back to the old name of Tech Field Day Podcast as a way to highlight what makes the podcast unique in the industry. Long-time listeners of the show may remember it used to be the Gestalt IT Tech Field Day Roundtable over a decade ago.

    Since then we’ve changed a lot about the format and content. Since 2017 we’ve been known as the On-Premise IT Podcast. It focuses on a specific topic, a premise if you will, each episode and features 3-4 Tech Field Day delegates as guests. We’ve posted 322 episodes in the past seven years talking about all aspects of enterprise technology both new and old. We’ve even focused on some non-tech issues like burnout and career growth. It’s all been for the betterment of the community at large as we bring you the opinions and perspectives of a group of experts in the enterprise IT space.

    We wanted to make sure to highlight the relationship between Tech Field Day and the podcast as we move forward. The delegates at a Field Day event represent the critical voice of the practitioner and give the episodes a sense of grounded realism. This isn’t a marketing exercise or wishful thinking. These are the people that do the things and tell everyone what works and what doesn’t. They are the ones qualified to inform decision makers about the promise as well as the pitfalls.

    Future episodes of our podcast will appear on the Tech Field Day site as well as through Spotify for Podcasters. We will still publish our new episodes every Tuesday so make sure you subscribe in your favorite podcatcher so you don’t miss a single premise that our wonderful delegates come up with each episode. Don’t forget to leave comments on the episodes so we know what you think. You can also leave a rating or a review in your podcatcher for others to discover the Tech Field Day Podcast.

    Podcast Information:

    Stephen Foskett is the Publisher of Gestalt IT and Organizer of Tech Field Day, now part of The Futurum Group. Find Stephen’s writing at Gestalt IT and connect with him on LinkedIn or on X/Twitter.

    Tom Hollingsworth is a Networking and Security Specialist at Gestalt IT and Event Lead for Tech Field Day. You can connect with Tom on LinkedIn and X/Twitter. Find out more on his blog or on the Tech Field Day website.

    Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube, Apple Podcasts, Spotify, or your favorite podcast application so you don’t miss an episode. Please do give us a rating and a review, it helps with discoverability. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group. For upcoming events and more episodes, head to the Tech Field Day website.

    © Gestalt IT, LLC for Gestalt IT: Reintroducing the Tech Field Day Podcast

    26 March 2024, 2:00 pm
  • AI Demands a New Storage Architecture with Hammerspace

    Hammerspace unveiled a new storage architecture called Hyperscale NAS that addresses the needs of AI and GPU computing. This episode of the On-Premise IT podcast, sponsored by Hammerspace, is focused on the extreme requirements of high-performance multi-node computing. Eric Bassier of Hammerspace joins Chris Grundemann, Frederic Van Haren, and Stephen Foskett to consider the characteristics that define this new storage architecture. Hammerspace leverages parallel NFS and flexible file layout (FlexFiles) within the NFS protocol to deliver unprecedented scalability and performance. AI training requires scalability, performance, and low latency but also flexible and robust data management, which makes Hyperscale NAS extremely attractive. Now that the Linux kernel includes NFS v4.2, the Hammerspace Hyperscale NAS system works out of the box with standards-based clients rather than requiring a proprietary client. Hammerspace is currently deployed in massive hyperscale datacenters and is used in some of the largest AI training scenarios.

    Combining Simplicity with Speed, with the New Hammerspace Hyperscale NAS Architecture

    Data is the new currency of the modern economy. It has opened huge opportunities to drive trailblazing technologies like AI and machine learning deep into businesses and industries. But as storage systems lie jammed with volumes of unstructured data, legacy solutions are under threat. Data overabundance can easily overwhelm and disrupt these established storage solutions, leaving organizations at risk of being outperformed by their rivals.

    This episode of the On-Premise IT Podcast, brought to you by Hammerspace, explores the reasons why the new data cycle requires next-generation storage systems. Eric Bassier, Sr. Director of Solution Marketing for Hammerspace, talks about a new NAS architecture that can accommodate all the data that’s heading enterprises’ way, and do it at the speed required for AI training.

    A Change Is in Order

    “AI is forcing a reckoning in the industry that’s probably long overdue, to change how data is used and preserved,” comments Bassier.

    Bassier puts storage systems into two main categories – the traditional scale-out network-attached storage (NAS), a technology already well-known and widely deployed in organizations, and the relatively new HPC parallel file systems designed exclusively for HPC environments.

    “The fact that the HPC file systems have never been widely deployed in the enterprise speaks to a gap there. They don’t have the right feature set, and are too difficult to maintain,” says Bassier.

    This is also telling of an uncomfortable truth about NAS systems. “The fact that HPC file systems still exist so predominantly in HPC environments is an admission that scale-out NAS architectures don’t meet their performance demands.”

    What fundamentally separates HPC and AI workloads from traditional workloads is the need for speed and performance. GPU farms for AI training need to access data concurrently at high speeds.

    A Disruptive Hyperscale NAS Architecture

    Hammerspace has a new architecture, Hyperscale NAS, that supports the colossal data capacity and performance demands of GPU farms.

    “[The architecture] largely came out of our work with one of the world’s largest hyperscalers for their large language model training environment. As more and more enterprises get into AI and drive their initiatives forward, this would be the best storage architecture for large language model training, generative AI training, and other forms of deep learning,” says Bassier.

    The unnamed client has a thousand-node Hammerspace storage cluster deployed in their LLM training environment where more than 30,000 GPUs are at work across 4,000 server nodes.

    “The Hammerspace storage cluster is feeding those GPUs at an aggregate performance of around 100 terabits per second. It’s 80 to 90% of line rate,” he says.
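    As a rough sanity check, the quoted figures can be combined with simple arithmetic. This sketch uses only the numbers cited above; it averages the aggregate throughput and says nothing about per-link speeds, which the source does not specify:

    ```python
    # Back-of-the-envelope math from the figures quoted above:
    # ~100 Tb/s aggregate, 1,000 storage nodes, ~30,000 GPUs.
    total_gbps = 100 * 1000              # 100 Tb/s expressed in Gb/s
    storage_nodes = 1000
    gpus = 30_000

    per_node_gbps = total_gbps / storage_nodes   # average Gb/s served per storage node
    per_gpu_gbps = total_gbps / gpus             # average sustained Gb/s per GPU

    print(per_node_gbps)           # 100.0
    print(round(per_gpu_gbps, 1))  # 3.3
    ```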

    Performance aside, the client chose Hyperscale NAS for the job because of its standards-based design: it can operate on any commercial off-the-shelf storage server, be it NAS, object, or block. A major benefit is that, by simply sitting on top of the storage, Hyperscale NAS can accelerate the underlying system without a costly upgrade.

    “The underpinnings of this architecture have been in Hammerspace since day one.” Bassier points to the origin of the name “Hammerspace” to underline this. A hammerspace, he explains, is an extradimensional space invisible to the eye. Characters in movies and cartoons often use it to store unusually large objects, which they summon in times of need, making it look as if they were conjured out of thin air. Think of Hermione Granger’s beaded handbag in Harry Potter, or Mary Poppins’ carpet bag.

    Chris Grundemann comments, “Hyperscale NAS appears at first blush to be a representation of that. There’s no proprietary client software needed. It just works as a NAS but in a really new way, to support these crazy GPU workloads in AI.”

    So, why did Hammerspace wait so long to introduce it? “We are bringing it to market now because of everything we’ve learned, where we’ve now proven this architecture at hyperscale,” says Bassier.

    The paradigm is fast evolving. HPC and AI/ML workloads are going to be pervasive across organizations, and they will need a new NAS architecture that provides both the performance of HPC file systems, with the right feature set, and the standards-based simplicity of the Network File System (NFS).

    Tying Together the Best of Both Solutions

    In a scale-out NAS architecture, data has to make multiple network hops between the client and server. The more hops, the higher the transmission latency. The Hyperscale NAS architecture opens a direct data path between the two points, reducing the number of transmissions and retransmissions. The result is lower latency and faster throughput.

    Metadata is handled out-of-band. “We offload a lot of the metadata operations to a separate path so we can streamline it.”

    Hyperscale NAS detaches data from metadata, putting them into two separate planes – the data plane and the control plane. The metadata resides inside the metadata service nodes which are essentially queryable databases.
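    As an illustration only, not Hammerspace’s implementation, the split described above can be sketched as a toy model: a queryable metadata service answers “where does this file live?”, and the client then reads directly from the data node, keeping the control plane out of the data path. All names here are hypothetical:

    ```python
    # Toy sketch of a control-plane / data-plane split (illustrative only).

    class MetadataService:
        """Queryable map from path -> (data_node, object_id); holds no file bytes."""
        def __init__(self):
            self.layouts = {}

        def record(self, path, node, obj_id):
            self.layouts[path] = (node, obj_id)

        def layout(self, path):
            return self.layouts[path]  # control-plane lookup only

    class DataNode:
        """Stores the actual bytes; clients read from it directly."""
        def __init__(self):
            self.objects = {}

        def write(self, obj_id, payload):
            self.objects[obj_id] = payload

        def read(self, obj_id):
            return self.objects[obj_id]

    # Client flow: one metadata lookup, then a direct data-path read.
    mds = MetadataService()
    node = DataNode()
    node.write("obj-1", b"training shard")
    mds.record("/datasets/shard0", node, "obj-1")

    where, obj = mds.layout("/datasets/shard0")  # control plane
    data = where.read(obj)                       # data plane, straight to the node
    ```

    The point of the sketch is that the metadata service never touches file contents, which is what lets the data path stay short.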

    This ties into another key aspect of the Hyperscale NAS architecture that Bassier highlights. Oftentimes, file systems are trapped in the storage layer, which makes data opaque to users. This is a barrier to collaboration.

    Hammerspace lifts the file system out of the storage layer and creates a global parallel file system with a single global namespace. Datasets are assimilated from multiple sources across sites and storage silos, and deposited into this file system. With global data orchestration, transparency is ensured for all users.

    “Even users that are remote or not co-located with the data are all presented the same files that they’re authorized to see.”

    Hyperscale NAS leverages the NFS v4.2 client, particularly two of its optional capabilities – parallel NFS and FlexFiles. “Hammerspace is the first one to take advantage of those capabilities,” says Bassier.
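    On the client side, this standards-based approach means an ordinary Linux NFS mount. A minimal sketch, assuming a hypothetical server name and export path (pNFS and FlexFiles layouts are negotiated automatically when a capable server offers them):

    ```shell
    # Hypothetical host and export path, for illustration only.
    # Requires a Linux kernel with NFS v4.2 client support.
    sudo mount -t nfs4 -o vers=4.2 anvil.example.com:/share /mnt/hammerspace

    # Verify the negotiated NFS version and mount options.
    nfsstat -m
    ```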

    If Hyperscale NAS sounds a lot like an HPC parallel file system to you, it is worth noting that there are significant differences. Where other solutions rely on proprietary file system clients or agents that sit on GPU servers to give them the intelligence, Hammerspace doesn’t, and works with all standards-based clients, he concludes.

    To learn more, visit Hammerspace’s website. Also check out Hammerspace’s presentation from the recent AI Field Day event for a deep dive into the architecture. Other interesting content includes Alastair Cooke’s article and Keith Townsend’s writeup on Hyperscale NAS.

    Podcast Information:

    Gestalt IT and Tech Field Day are now part of The Futurum Group.

    Follow us on Twitter! AND SUBSCRIBE to our newsletter for more great coverage right in your inbox.

    © Gestalt IT, LLC for Gestalt IT: AI Demands a New Storage Architecture with Hammerspace

    19 March 2024, 2:00 pm
  • No One Wants To Be A Network Engineer Any More

    The job market is more competitive than ever, but interest in filling network engineering roles is lower than before. In this episode, Tom Hollingsworth is joined by Ryan Lambert, Dakota Snow, and David Varnum for an examination of why network design and implementation isn’t a hot career path. They look at the rise of cloud as a discipline as well as the reduction of complexity in modern roles with help from software and automation shifts. They also discuss how entry-level professionals can adjust their thinking to take advantage of open roles on the market.

    Podcast Information:

    Gestalt IT and Tech Field Day are now part of The Futurum Group.

    Follow us on Twitter! AND SUBSCRIBE to our newsletter for more great coverage right in your inbox.

    © Gestalt IT, LLC for Gestalt IT: No One Wants To Be A Network Engineer Any More

    12 March 2024, 2:00 pm
  • Real World AI Looks a Lot Different From the Movies

    Most people envision AI as a cool and orderly datacenter activity, but this technology will soon be everywhere. This episode of the On-Premise IT podcast contrasts the AI-based greenhouses of Nature Fresh Farms, as presented by guest Keith Bradley at AI Field Day, with the massive GPU-bound infrastructure many people imagine. Allyson Klein, Frederic Van Haren, and Stephen Foskett attended AI Field Day and were intrigued by the ways AI can process data from cameras and other sensors in a greenhouse environment.

    Podcast Information:

    Gestalt IT and Tech Field Day are now part of The Futurum Group.

    Follow us on Twitter! AND SUBSCRIBE to our newsletter for more great coverage right in your inbox.

    © Gestalt IT, LLC for Gestalt IT: Real World AI Looks a Lot Different From the Movies

    5 March 2024, 3:00 pm