Audio
CKMS 102.7 FM: Radio Waterloo
What Waterloo Region’s home buyers, sellers and renters need to know going into 2025
CKMS 102.7 FM: Radio Waterloo
Radio Nowhere Episode 98 Dylan 1, 1/18/25
CKMS 102.7 FM: Radio Waterloo
So Old It’s New set for Monday, January 20, 2025
CIGI/The Logic Big Tech Podcast
Questions About AI? We Want to Hear Them
We spend a lot of time talking about AI on this show: how we should govern it, the ideologies of the people making it, and the ways it's reshaping our lives.
But before we barrel into a year where I think AI will be everywhere, we thought this might be a good moment to step back and ask an important question: what exactly is AI?
On our next episode, we'll be joined by Derek Ruths, a Professor of Computer Science at McGill University.
And he's given me permission to ask him anything and everything about AI.
If you have questions about AI, or how it's impacting your life, we want to hear them. Send an email or a voice recording to: machineslikeus@paradigms.tech
Thanks – and we’ll see you next Tuesday!
CKMS 102.7 FM: Radio Waterloo
New Music Added to Libretime + Horizon Broadening Hour #62
Dan Nedelko
Struggling to grow your business? Clients driving you insane? #business
Dan Nedelko
Mobile + Google Business Profiles = conversion magic for small businesses.
Cordial Catholic, K Albert Little
Former Evangelical Pastor Explains the BIBLICAL Roots of Pope Francis' Jubilee (w/ Dr. John Bergsma)
CKMS 102.7 FM: Radio Waterloo
Reader’s Delight, Episode 14
Debt Free in 30 Minutes
542 – The Truth of Credit Card Rewards – They're Costing You
Are credit card points and rewards as beneficial as they seem, or are they secretly costing you more than you realize?
Doug Hoyes and Ted Michalos discuss the truths behind credit card reward systems, revealing how spending habits, annual fees, and high-interest cards may affect your financial health. Learn how credit card companies use your data, the psychology behind reward marketing, and whether these points and rewards are worth it in the long run.
(0:50) – Have you changed your spending habits to earn credit card rewards?
(1:45) – The history of credit card rewards: How they started and why they’re popular.
(5:45) – What personal data do credit card companies collect from you?
(8:00) – How credit card companies market rewards and why we’re drawn to them.
(10:45) – The biggest downside of credit card rewards
(12:55) – Doug and Ted's advice: How many credit cards should you really have?
(14:30) – Why using high-interest cards can lead to risky financial situations.
(18:20) – How credit card usage affects your credit score: Tips to avoid negative impacts.
(22:30) – The overlooked cost of annual fees
(26:00) – The risk of losing your points during insolvency: Use them, or lose them.
(27:00) – Expert advice on credit card rewards
Learn more from Hoyes Michalos:
What Happens to Points If You Go Bankrupt?
Pros and Cons of Preapproved Limit Increase
Unpaid Credit Card Consequences
Debt Repayment Calculator
Debt To Income Ratio Calculator
FREE Credit Rebuilding Course
Sign Up for Our Newsletter HERE: www.hoyes.com/subscribe-newsletter/
Watch the Hoyes Michalos YouTube Channel
Hoyes Michalos Instagram
Hoyes Michalos Facebook
Hoyes Michalos TikTok
Hoyes Michalos Twitter (X)
Hoyes Michalos LinkedIn
Straight Talk on Your Money by Doug Hoyes
Find a Hoyes Michalos Office in Your Area Here
Disclaimer: The information provided in the Debt Free in 30 Podcast is for entertainment and informational purposes only and is not intended as personal financial advice. Individual financial situations vary and may require personalized advice from a qualified financial advisor. Always consult with a financial professional. The views expressed in this episode do not necessarily reflect the opinions of Hoyes, Michalos & Associates, or any other affiliated organizations. We do not endorse or guarantee the effectiveness of any specific financial institutions or strategies discussed.
Cordial Catholic, K Albert Little
Biblical Scholar Dr. John Bergsma explains why Peter’s Office Passed Down! #pope #apologetics
CKMS 102.7 FM: Radio Waterloo
Community Connections interview with MP Tim Louis about Bill C-355 – Bill to prohibit the air transport of horses for slaughter
CKMS 102.7 FM: Radio Waterloo
So Old It’s New set for Saturday, January 18, 2025
CKMS 102.7 FM: Radio Waterloo
CKMS News -2025-01-17- MT Space hosting “Works-in-Progress” mini festival this weekend
CKMS 102.7 FM: Radio Waterloo
CKMS News -2025-01-17- Art exhibition brings Palestinian culture and heritage to Kitchener’s city hall
The Sound Affect
Matt Hart from The Russian Futurists looks back on the 25th anniversary of their album The Method of Modern Love, and the influence of The Magnetic Fields.
25 years ago, the Toronto and Canadian music landscape was turned on its ear with the release of The Russian Futurists’ internationally acclaimed debut, The Method of Modern Love. The album is an unabashed confession of romance built on a bedrock of thrift-store toy keyboards and cheap guitars recorded on a Portastudio. The Method of Modern Love won fans in R.E.M.'s Peter Buck, former Blur guitarist Graham Coxon, and filmmaker Jason Lee.
The Magnetic Fields' first two albums, Distant Plastic Trees and The Wayward Bus, were the template for The Russian Futurists and a change in the direction of Canadian indie music!
CKMS 102.7 FM: Radio Waterloo
NO CRAP RADIO VER. 4.96 Jan. 18/25 12AM
Communitech
Founders First: A Conversation with Ruth Casselman and Jennifer Gruber
CKMS 102.7 FM: Radio Waterloo
WaSun on the Regime – Interview about the newly released: EARTH MOTHER – best album
CKMS 102.7 FM: Radio Waterloo
!earshot Daily
CKMS 102.7 FM: Radio Waterloo
Through the Static Episode 50 – 15/01/25
CKMS 102.7 FM: Radio Waterloo
The Clean Up Hour, Mix 301
Dan Nedelko
YouTube is DOMINATING video and it's not even close. YouTube is going to dominate podcasting next.
CKMS 102.7 FM: Radio Waterloo
CKMS News -2025-01-15- Exploring Indigenous futurisms at WPL’s Indigenous reading circle
Cordial Catholic, K Albert Little
Martin Luther and the Reformers Were WRONG! (w/ The Catholic Brothers)
CIGI/The Logic Big Tech Podcast
This Mother Says a Chatbot Led to Her Son’s Death
In February, 2024, Megan Garcia’s 14-year-old son Sewell took his own life.
As she tried to make sense of what happened, Megan discovered that Sewell had fallen in love with a chatbot on Character.AI – an app where you can talk to chatbots designed to sound like historical figures or fictional characters. Now Megan is suing Character.AI, alleging that Sewell developed a “harmful dependency” on the chatbot that, coupled with a lack of safeguards, ultimately led to her son’s death.
They’ve also named Google in the suit, alleging that the technology that underlies Character.AI was developed while the founders were working at Google.
I sat down with Megan Garcia and her lawyer, Meetali Jain, to talk about what happened to Sewell. And to try to understand the broader implications of a world where chatbots are becoming a part of our lives – and the lives of our children.
We reached out to Character.AI and Google about this story. Google did not respond to our request for comment by publication time.
A spokesperson for Character.AI made the following statement:
“We do not comment on pending litigation.
Our goal is to provide a space that is both engaging and safe for our community. We are always working toward achieving that balance, as are many companies using AI across the industry. As part of this, we have launched a separate model for our teen users – with specific safety features that place more conservative limits on responses from the model.
The Character.AI experience begins with the Large Language Model that powers so many of our user and Character interactions. Conversations with Characters are driven by a proprietary model we continuously update and refine. For users under 18, we serve a version of the model that is designed to further reduce the likelihood of users encountering, or prompting the model to return, sensitive or suggestive content. This initiative – combined with the other techniques described below – combine to produce two distinct user experiences on the Character.AI platform: one for teens and one for adults.
Additional ways we have integrated safety across our platform include:
Model Outputs: A “classifier” is a method of distilling a content policy into a form used to identify potential policy violations. We employ classifiers to help us enforce our content policies and filter out sensitive content from the model’s responses. The under-18 model has additional and more conservative classifiers than the model for our adult users.
User Inputs: While much of our focus is on the model’s output, we also have controls to user inputs that seek to apply our content policies to conversations on Character.AI. This is critical because inappropriate user inputs are often what leads a language model to generate inappropriate outputs. For example, if we detect that a user has submitted content that violates our Terms of Service or Community Guidelines, that content will be blocked from the user’s conversation with the Character. We also have a process in place to suspend teens from accessing Character.AI if they repeatedly try to input prompts into the platform that violate our content policies.
Additionally, under-18 users are now only able to access a narrower set of searchable Characters on the platform. Filters have been applied to this set to remove Characters related to sensitive or mature topics.
We have also added a time spent notification and prominent disclaimers to make it clear that the Character is not a real person and should not be relied on as fact or advice. As we continue to invest in the platform, we will be rolling out several new features, including parental controls. For more information on these new features, please refer to the Character.AI blog HERE.
There is no ongoing relationship between Google and Character.AI. In August, 2024, Character.AI completed a one-time licensing of its technology and Noam went back to Google.”
If you or someone you know is thinking about suicide, support is available 24-7 by calling or texting 988, Canada’s national suicide prevention helpline.
Mentioned:
Megan Garcia v. Character Technologies, Et Al.
“Google Paid $2.7 Billion to Bring Back an AI Genius Who Quit in Frustration” by Miles Kruppa and Lauren Thomas
“Belgian man dies by suicide following exchanges with chatbot,” by Lauren Walker
“Can AI Companions Cure Loneliness?,” Machines Like Us
“An AI companion suggested he kill his parents. Now his mom is suing,” by Nitasha Tiku
Further Reading:
“Can A.I. Be Blamed for a Teen’s Suicide?” by Kevin Roose
“Margrethe Vestager Fought Big Tech and Won. Her Next Target is AI,” Machines Like Us
Mid-Credit Scene
Mid-Credit Minute - The Give and Take with Conway Fraser
When you love movies as much as Conway Fraser does, it seems downright cruel to ask him to pick just one he feels he inherited from the past and just one to leave to the future. But we didn't claim we weren't cruel, so we asked him anyway. What did he pick for the Give and Take? Listen and find out – they're classics, and probably not ones you'd expect.
Newsletter: midcreditscene.substack.com/.
Email: midcreditscenepod@gmail.com
Show theme: The Show Must Be Go by Kevin MacLeod
Logo design: Jon Johnson