SCIENTIFIC SECTIONS
I. DEVELOPMENT OF NEW RESEARCH / INTERVENTION METHODS.
METHODOLOGY AND EXPERIMENTAL PSYCHOLOGY
- Human–AI interaction and trust in AI systems.
- Cyberpsychology of gaming, online communities, and avatars.
- Data collection and analysis.
- Predictive models for mental health risk (e.g., depression, suicide, PTSD).
- Personality prediction from digital behaviors (texts, browsing, gaming).
- AI simulations of cognitive processes to test psychological theories.
- Automated participant matching and randomization.
- Adaptive experiments where AI adjusts conditions in real-time based on participant responses.
- Virtual reality and AI-driven avatars for controlled social interaction studies.
- AI-assisted diagnostics in psychological research.
- Personalized intervention testing (AI recommends treatments based on participant profiles).
- Digital mental health tools as both research instruments and interventions.
- Ethics in modern research.
- Bias in AI-driven psychological studies (training data reflecting social inequalities).
- Transparency and reproducibility issues in black-box AI models.
- The risk of over-reliance on algorithmic predictions over human interpretation.
- Bias risks in AI-based assessment tools (language, culture, socio-economic background).
- Assessment of the research process.
- Development of e-learning products.
- Multimedia tools in research.
- Transfer of technology in workgroups / organizations.
- Human–computer interface (HCI) applications.
- Assessing the quality of human resources.
- Qualitative methods vs. quantitative methods in modern research.
II. RESEARCH AND EDUCATION. EDUCATIONAL PSYCHOLOGY
- AI tutors that adjust difficulty and content to each learner’s cognitive profile.
- Real-time feedback on attention, engagement, and emotional states through AI-driven eye-tracking or affective computing.
- Detecting learning disabilities early through behavioral and performance data.
- AI chatbots as learning companions to support motivation and reduce anxiety.
- Detecting stress, boredom, or cognitive overload in students via biometric and behavioral data.
- AI-driven interventions for improving memory, attention, and problem-solving.
- AI systems supporting teachers in managing diverse classrooms (identifying struggling learners, suggesting interventions).
- Risks of over-reliance on AI leading to reduced teacher-student emotional connection.
- Ethical challenges of “surveillance AI” monitoring students’ behavior.
- Risk of reinforcing educational inequality if AI tools favor resource-rich schools.
- Ethical implications of using student psychological data for predictive analytics.
- Balancing personalization with student privacy and autonomy.
- Effects of early screen exposure on child development.
- Gamification in education and its psychological impact.
- Emotional regulation in adolescents during the digital era.
- Parenting styles in multicultural and migrant families.
- The role of attachment in resilience and coping.
- Research for Lifelong Learning.
- Research and education for entrepreneurship.
- Curricula design for education.
- Strategies for quality improvement in education.
- Predicting academic success or dropout risk.
- Social and public responsibility in modern education.
III. WORK AND ORGANIZATIONAL PSYCHOLOGY. CAREER AND WORKPLACE ISSUES
- The impact of artificial intelligence on human cognition and decision-making.
- Automation of routine tasks and the resulting shift of focus to creativity, problem-solving, and emotional intelligence.
- Redefining job identity when employees work alongside AI/robots.
- Psychological effects of AI-driven restructuring and “technological unemployment.”
- AI-based applicant screening, personality prediction, and video-interview analysis.
- Risks of algorithmic bias (gender, race, age) in hiring decisions.
- Impact on applicant perceptions: fairness, transparency, and trust in AI hiring tools.
- AI-driven performance monitoring (productivity tracking, keystrokes, biometric data).
- Personalized training platforms adapting to employees’ learning styles.
- Employee resistance or acceptance of AI-based monitoring systems.
- Use of AI decision-support systems in management.
- Changes in leadership styles when leaders share decision-making with algorithms.
- Trust dynamics: when do employees accept or resist AI-led directives?
- Stress and anxiety from constant AI surveillance and productivity metrics.
- Fear of job loss vs. opportunities for job enrichment through AI support.
- The role of AI in promoting work–life balance (smart scheduling, wellness apps).
- How AI shapes workplace norms (data-driven vs. human-centered cultures).
- Ethical implications of using AI for workforce analytics (privacy, consent).
- Power dynamics: AI as a “neutral” decision-maker vs. tool reinforcing organizational biases.
- Aging in workgroups / organizations.
- Health promotion in workgroups / organizations.
- Balancing personal and professional life.
- Self-regulation and affirmation.
- Knowledge and skills. Performance and dynamic competence in workgroups and organizations.
- Coping with dysfunctional people or groups.
- Stress and emotions.
- Risk perception in workplaces.
- Communication. Social change.
- Organizational culture.
- Rules, regulations and values in organizations.
- Social support, coping and wellness.
- Innovation and creativity.
- Leadership competences.
IV. LIFE QUALITY. CLINICAL AND HEALTH PSYCHOLOGY
- Ethical challenges of AI in mental health diagnosis and therapy.
- Virtual reality (VR) as a tool for cognitive training and therapy.
- The neuroscience of mindfulness and meditation.
- Neuroplasticity and mental health recovery.
- Personalized mental health interventions using big data.
- The role of psychedelics in treating depression, PTSD, and addiction.
- Teletherapy effectiveness vs. face-to-face therapy.
- Psychological aspects of long-COVID and chronic illness.
- Trauma-informed care in diverse populations.
- Personality and health.
- Illness perception. Managing illness.
- Family and health.
- Occupational health.
- Culture and health.
- Health promotion in society.
- Traditional and modern approaches to health assurance.
V. FORENSIC SCIENCES AND CRIMINAL INVESTIGATION.
FORENSIC PSYCHOLOGY AND PSYCHIATRY
- Forensic psychology in digital crime and cyberbullying cases.
- AI in criminal profiling and risk assessment.
- Use of AI to analyze micro-expressions, speech patterns, and physiological signals in forensic interviews.
- Comparing AI lie detection with polygraph and human assessment accuracy.
- AI detection of deepfakes and its forensic importance in court cases.
- The role of AI in jury selection and prediction of verdict tendencies.
- Ethical concerns about AI influencing judicial decisions.
- How AI-generated psychological reports could affect expert witness credibility.
- AI-assisted diagnosis of psychopathology in offenders.
- Predictive models for violence or self-harm in correctional facilities.
- The risks of over-reliance on AI in determining criminal responsibility.
- Privacy concerns in AI-based psychological profiling.
- Potential bias (racial, gender, socio-economic) embedded in forensic AI tools.
- Standards for admissibility of AI-generated evidence in court.
- Criminal personality.
- Criminal behaviour.
- Criminal groups.
- Civil disputes.
- Criminal laws.
- Government regulations.
- Public health protection.
VI. MILITARY PSYCHOLOGY
- AI in training and simulation.
- AI chatbots/virtual therapists providing immediate psychological support in deployment zones.
- Predictive models for PTSD risk based on biometric, behavioral, and speech data.
- AI tools for early detection of depression, anxiety, or burnout in military personnel.
- Psychological effects of working alongside autonomous drones and robotic systems.
- Trust in AI decision-making during combat (when to defer to or override algorithms).
- Cognitive load and stress when soldiers must interpret AI recommendations in high-risk scenarios.
- Using AI to analyze cognitive, emotional, and personality data for soldier selection.
- Ethical implications of AI predicting combat suitability.
- Risks of algorithmic bias in recruitment and psychological screening.
- Combat and ethical decision-making.
- AI-assisted therapy for PTSD and moral injury (chatbots, VR exposure therapy, personalized recovery plans).
- Long-term tracking of veterans’ psychological health using AI monitoring systems.
- Privacy and trust issues in using AI for veterans’ psychological care.
VII. RESEARCH IN SOCIAL AND CULTURAL SETTINGS. SOCIAL PSYCHOLOGY
- The psychology of social media use and digital identity.
- AI chatbots and virtual agents as new “social actors”: how people attribute emotions, trust, and empathy to machines.
- Parasocial relationships with AI (companions, therapy bots, gaming avatars).
- AI-mediated communication: how algorithms shape online interactions and social identity.
- How AI-curated feeds shape self-esteem, body image, and identity development.
- Use of AI-generated avatars and its effect on self-presentation and social comparison.
- Psychological effects of being judged or evaluated by AI (in hiring, dating, education).
- Algorithmic bias as a modern form of discrimination (gender, race, class).
- How exposure to biased AI systems influences human prejudice.
- Public perceptions of fairness when AI makes social or legal decisions.
- Spread of misinformation through AI-generated deepfakes.
- AI’s role in shaping collective action (political campaigns, activism).
- Trust in institutions using AI (government, healthcare, education).
- Using AI for large-scale analysis of social behavior (tweets, forums, video interactions).
- Virtual reality experiments with AI-driven agents to simulate group dynamics.
- Ethical risks of studying human behavior with AI-driven surveillance.
- Cross-cultural resilience and mental health in global crises.
- Psychology of diversity, inclusion, and implicit bias.
- Research design and data analysis.
- Cross-cultural research.
- Immigration issues.
- Individuals and groups.
- Interpersonal relationships.
- Group behaviour.
- Intergroup relations.
- Norms and values.
- Social discrimination and stereotypes.
VIII. MISCELLANEOUS ...